
Ontology-Based Customizable 3D Modeling for Simulation

Permanent Link: http://ufdc.ufl.edu/UFE0010095/00001

Material Information

Title: Ontology-Based Customizable 3D Modeling for Simulation
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0010095:00001



ONTOLOGY-BASED CUSTOMIZABLE 3D MODELING FOR SIMULATION


By

MINHO PARK


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA


2005


Copyright 2005

by

Minho Park


To my parents, my lovely wife Suwon, and our daughter Emily


ACKNOWLEDGMENTS

I would like to express my deepest gratitude to my advisor, Dr. Paul A. Fishwick,

who gave me inspiration and guidance throughout my Ph.D. studies at the University of

Florida. I would also like to give my sincere appreciation to my Ph.D. committee

members, Dr. Joachim Hammer, Dr. Beverly Sanders, Dr. Sherman Bai, and Dr.

Abdelsalam A. Helal, for their precious time and advice on my research.

I thank all my colleagues in our research group for sharing valuable ideas.

Special thanks go to Jinho Lee and Hyunju Shim for their help and companionship.

Also, I am grateful to the National Science Foundation and the Air Force Research

Laboratory for their financial support for my studies in the United States.

I owe great love to my parents, who prayed for and encouraged me throughout my

studies, and my wife, Suwon, and my daughter, Emily (Soyeon), who shared all the

wonderful and difficult moments with me here in Gainesville. They are the reason for my

existence.


TABLE OF CONTENTS


ACKNOWLEDGMENTS

LIST OF TABLES

LIST OF FIGURES

ABSTRACT

CHAPTER

1 INTRODUCTION

    1.1 Motivations and Challenges
    1.2 Contributions to Knowledge
        1.2.1 Provide an Integrated Modeling and Simulation Environment
        1.2.2 Ontology Management and Model Database Construction
        1.2.3 Present an Interaction Model Concept
        1.2.4 Improvements in the Visual Model Construction
    1.3 Organization of the Dissertation

2 BACKGROUND

    2.1 Dynamic Models
    2.2 RUBE: An XML-based 2D and 3D Modeling Framework for Simulation
    2.3 Ontology

3 INTEGRATIVE MULTIMODELING

    3.1 Approach
        3.1.1 Aircraft Class
        3.1.2 Sensors Class
        3.1.3 Process and Display Classes
        3.1.4 Geometry Model, Dynamic Model, and Information Model Classes
        3.1.5 Interaction Model Class
        3.1.6 Summary
    3.2 Methodology

4 IMPLEMENTATION

    4.1 Overview
    4.2 Ontology: A Framework for Encoding Modeling Knowledge
        4.2.1 Classes
        4.2.2 Properties
    4.3 Blender Interface
        4.3.1 Model Architecture
        4.3.2 Model Explorer
        4.3.3 Ontology Explorer
        4.3.4 Export
        4.3.5 Simulation
        4.3.6 Blender Game Logic

5 CASE STUDY

    5.1 A Military Application
        5.1.1 Modeling for Geometry and Dynamic Models
        5.1.2 Modeling for Interaction Model
        5.1.3 Code Generation
        5.1.4 Integrative Multimodeling with Simulation
    5.2 A Light Bulb Application
        5.2.1 Ontology
        5.2.2 Modeling for Geometry and Dynamic Models
        5.2.3 Modeling for Interaction Model
        5.2.4 Code Generation
        5.2.5 Integrative Multimodeling with Simulation

6 CONCLUSION

    6.1 Summary of Results
    6.2 Future Research
        6.2.1 Ontological Domain Extension
        6.2.2 Model Type Extension
        6.2.3 Interface Construction
        6.2.4 Visual Programming Support

APPENDIX

A BLENDER INTERFACE SOURCE

    A.1 Graphical User Interface
    A.2 MXL Creation
    A.3 OWL Management
    A.4 Snap to Grid

B LIGHT BULB EXAMPLE SOURCES

    B.1 DXL
    B.2 Simulation Code in Python

C OWL AND XSLT SOURCES

    C.1 Example 1: Battle Scene
    C.2 Example 2: Light Bulb
    C.3 Combined OWL: FSM and Scene Ontologies
    C.4 Parser in XSLT

LIST OF REFERENCES

BIOGRAPHICAL SKETCH


LIST OF TABLES

2-1. Mapping Rules

4-1. Properties


LIST OF FIGURES


2-1. Declarative Model
2-2. Functional Model
2-3. RUBE Framework
2-4. An FSM Describing a Four-Stroke Gasoline Engine [25]
2-5. MXL Representation (FBM) for the Example
2-6. MXL Representation (FSM) for the Four-Stroke Gasoline Engine
2-7. DXL Representation for the Four-Stroke Gasoline Engine
2-8. Simulation Output from the 2D Engine Model
2-9. Simulation Results
2-10. A Primitive FSM Ontology
2-11. OWL Representation for the FSM
3-1. Scene Ontology
4-1. The Overall Structure for the Integrative Multimodeling Environment
4-2. Scene Ontology
4-3. OWL Representation
4-4. Class Definitions in Protege
4-5. Property Definitions in Protege
4-6. Blender Environment
4-7. Model Architecture
4-8. FSM with Tank-Pipe Metaphor
4-9. FSM with Primitives
4-10. A Snapshot of Model Explorer
4-11. Point a Source and a Target Object
4-12. Select a Connection Object
4-13. Connection
4-14. Select a Connection Object
4-15. Connection
4-16. A Snapshot of Ontology Explorer
4-17. Pop-Up Menu
4-18. Instance Creation
4-19. hasDynamic Property Creation
4-20. hasGeometry Property Creation
4-21. File Name and Location Specification
4-22. New Instance
4-23. Import Process
4-24. GUI for Export
4-25. MXL and Function Definitions for the Example
4-26. GUI for Simulation
4-27. Blender Game Logic
5-1. 2D Dynamic FBM Representation of the Combat Scene
5-2. 2D Dynamic Model (FBM) Representation of Interaction Model
5-3. Interaction Model for the F15 (Geometry Object)
5-4. Interaction Model for the F15 (Dynamic Object)
5-5. MXL for the Example
5-6. DXL for the Example
5-7. Initial Scene (Geometry Model)
5-8. Scene Prior to Interaction
5-9. Model Morphing
5-10. Dynamic Model
5-11. Light Bulb
5-12. FSM Representation for the Light Bulb Example
5-13. Multimodel Representation for the Light Bulb Example
5-14. Light Bulb Ontology
5-15. 2D Dynamic Model (FBM) Representation of Interaction Model
5-16. Interaction Model for State 3
5-17. MXL for the Example
5-18. MXL for FSM
5-19. Scene Prior to Interaction (Geometry Model)
5-20. Model Morphing
5-21. Dynamic Model
5-22. Two Model Types (Geometry and Dynamic Models)
6-1. Extended OWL Representation


Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

ONTOLOGY-BASED CUSTOMIZABLE 3D MODELING FOR SIMULATION

By

Minho Park

May, 2005

Chair: Paul A. Fishwick
Major Department: Computer and Information Science and Engineering

Modeling techniques tend to be found in isolated communities: geometry models in

computer-aided design (CAD) and computer graphics, dynamic models in computer

simulation, and information models in information technology. When models are

included within the same digital environment, the ways of connecting them together

seamlessly and visually are not well known even though elements from each model have

many commonalities. We attempt to address this deficiency by studying specific ways in

which models can be interconnected within the same 3D space. For example, consider a

scenario involving a region with several key military vehicles and targets: planes (both

fighters and a command and control center), surface-to-air missile (SAM) sites, and drones.

A variety of models define the geometry, information, and dynamics of these objects.

Ideally, we can explore and execute these models within the 3D scene by formalizing

domain knowledge and providing a well-defined methodology.

We present a modeling and simulation methodology called integrative

multimodeling. The purpose of integrative multimodeling is to provide a human-computer

interaction environment that allows components of different model types to be linked to

one another, most notably dynamic models used in simulation to geometry models for

the phenomena being modeled. In the context of integrative multimodeling, the following

general issues naturally arise: 1) "How can we connect different model components?"; 2)

"How can we visualize different model types in 3D space?"; and 3) "How can we

simulate a dynamic model within the integrative multimodeling environment?"

For the first issue, we have defined a formalized scene domain to bridge semantic

gaps between the different models and facilitate mapping processes between the

components of the different models by conceptualizing all objects existing in the scene

domain using semantic languages and tools.

For the second issue, we have developed a Python-based interface to provide

visualization environments. Using the interface, users can visualize and create their own

model types as well as construct a model component database.

For the third issue, we have employed the RUBE framework, which was developed

by the Graphics, Modeling and Arts (GMA) Laboratory at the University of Florida.

RUBE is an extensible markup language (XML)-based Modeling and Simulation

framework and application, which permits users to specify and simulate a dynamic

model, with an ability to customize a model presentation using 2D or 3D visualizations.

In addition, human-computer interactions are needed to achieve integrative

multimodeling between multiple models. To facilitate the integrative multimodeling, the

interactions should be formalized and executable in the environment. Therefore, the

concept of interaction model for integrative multimodeling is created and represented as a

function block model (FBM) for formalizing human-computer interactions.









This work to date has resulted in an environment permitting users to explore

dynamic model structure through interactions with geometric scene structure.



CHAPTER 1
INTRODUCTION

1.1 Motivations and Challenges

A model is a simplified representation of a real entity, created to increase

our understanding of that entity. Modeling is the process of making the model. Various

modeling techniques and methodologies have been introduced and applied to real-world

applications in isolated communities, such as computer graphics and CAD, computer

simulation, and information technology. Researchers focus on geometric modeling

methods [1, 2] in computer graphics and CAD, on system behavioral modeling

approaches [3-6] in computer simulation, and on information modeling approaches

[7, 8] in information technology.

Even though researchers in diverse areas have their own distinct modeling

environments and concepts, they commonly describe the real world, albeit from different

perspectives, under different environments (interfaces), and with different storage systems.

Through the modeling process, the real world could be expressed in diverse model types,

such as a geometry model, a dynamic model, or an information model. Ideally, we can

explore and execute these models within a unified 3D scene that integrates diverse

models.

Our research was started by posing five general questions:

1. Is there any way we can include different model types within the same
environment?

2. How can we connect them together seamlessly, visually, and effectively?









3. How can we overcome semantic heterogeneity between the models?

4. How can we manage modeling knowledge?

5. How can we control the model presentations under the environment?

We found that the ability to create customized 3D models, effective ontology

construction, and human computer interaction (HCI) techniques [9-11] help to blend and

stitch together different model types.

We present a new modeling and simulation methodology called integrative

multimodeling [ 12-14]. The purpose of integrative multimodeling is to provide a human

computer interaction environment that allows components of different model types to be

linked to one another.

To support integrative multimodeling, the open source 3D Blender software is

employed as a comprehensive modeling and simulation tool [15-17]. The Web Ontology

Language (OWL) [18], a language for representing knowledge on the web, is used to

bridge semantic gaps between the different models, facilitate mapping processes between

the components of the different models, construct a model component database, and

manage modeling knowledge.

In addition, we introduce an interaction model for human computer interactions.

The interaction model can be implemented and executed within a Blender environment or

a virtual reality modeling language (VRML) environment.

To construct and simulate dynamic models, we use the RUBE framework, which is

an XML-based modeling and simulation tool [19-22] developed by our research group. In

RUBE, two XML languages, multimodel exchange language (MXL) [23] and dynamic

exchange language (DXL) [24], were designed to capture the semantic content for the

dynamic models.









1.2 Contributions to Knowledge

1.2.1 Provide an Integrated Modeling and Simulation Environment

In the area of modeling and simulation, dynamic and static models form one common

model classification [25-27]. A dynamic model, in contrast to a static model, represents a

system as its model variables evolve over time. We

can represent the dynamic behaviors as one of the model types, such as functional block

model (FBM), finite state model (FSM), equation model, system dynamics model, or

queuing model. On the other hand, a variety of geometric model types are classified in a

graphics and CAD community. For example, boundary representations, constructive solid

geometry (CSG), and wireframe models are used to represent geometric models [28-30].

Ideally, we can blend and stitch together different model types so that users or

modelers can explore and execute these models within a unified 3D scene. We started to

investigate ways in which we can integrate geometry and dynamic models within the

same scene environment. As a result, an integrated modeling and simulation environment

for geometry and dynamic models has been developed; it supports viewing and interacting

with multiple models in a Blender 3D environment, as well as simulating a dynamic model

and animating a geometry model in the same digital environment. To build the integrated

environment, the Blender Interface, the Blender Game Engine, and the RUBE framework

are employed.

As related work, LabVIEW [31] is a fully featured graphics-based application

development tool produced by National Instruments. It has been used for visualizing data

flow, as well as for analyzing and controlling data, through its Vision Development and

Real-Time modules. However, LabVIEW supports only a 2D-based graphical

environment and does not provide a module for handling ontologies.









1.2.2 Ontology Management and Model Database Construction

An ontology represents a formal conceptualization of a domain. Ontology

languages, such as resource description framework (RDF) [32, 33] and OWL, and tools,

such as Protege [34, 35] and semantic web ontology overview and perusal (SWOOP)

[36], have been developed for representing knowledge and information about a certain

specific domain.

We developed an ontology management tool, Ontology Explorer, for manipulating

an ontology: it allows users to create class instances for constructing a model

component database and to generate classes or subclasses within the OWL-based

ontology, so that the OWL ontology can be managed without leaving the environment. In

addition, Ontology Explorer provides an alternate modeling process for geometry and

dynamic models by reusing geometry and dynamic model objects. A brief review of

ontologies and ontology-related work is necessary to understand Ontology Explorer.

The Semantic Web [37] technologies, such as ontology languages involving RDF,

resource description framework-schema (RDF-S) [38], and OWL, are employed in a

variety of communities to share or exchange information, as well as to deal with

semantic gaps between different domains. In the information systems community,

including the database systems community, ontologies are used to achieve semantic

interoperability in heterogeneous information systems by overcoming structural

heterogeneity and semantic heterogeneity between the information systems [39-42]. In a

simulation and modeling community, Miller, Sheth, and Fishwick propose the Discrete-

event Modeling Ontology (DeMO) [43, 44] to facilitate modeling and simulation. To

represent core concepts in the discrete-event modeling domain, they define four main

abstract classes in the ontology: DeModel, ModelConcepts, ModelComponents, and

ModelMechanisms. The DeModel class defines general model types such as Petri-Net and

Markov. Corresponding model elements are described in ModelComponents using "has-

a" relationships. In ModelConcepts, they define fundamental concepts used in

constructing dynamic models, such as State, Event, and Tokens. Diverse modeling

techniques, such as Event-based and Transition-based, are conceptualized in

ModelMechanisms. DeMO is a general domain-based ontology for discrete-event

modeling, while our scene ontology is application-based and instance-based for

integrative multimodeling, as will be shown in the next section. Liang and Paredis

[45] define a port ontology to capture both syntactic and semantic information for

allowing modelers to reason about the system configuration and corresponding

simulation models.

As OWL-based ontology editor tools, Protege and SWOOP are used for the

development of OWL ontologies. The Protege editor is a powerful tool that allows users to

construct a domain ontology and create instances of OWL classes, and many plug-ins,

such as ezOWL [46] and OntoViz [47], support it. On the other hand,

SWOOP is a web-based development tool for ontologies. It is a simple and handy tool

compared to the Protege editor; however, it does not support instance creation.

We create OWL-based ontologies to define application domains and modeling

knowledge, to bridge semantic gaps between the different models, and to facilitate

mapping processes between the components of the different models. Ontology

Explorer has been developed to support the modeling and simulation process within the

Blender environment and to provide ontology-editor functionality.









The distinct points, compared with related work in the simulation and

modeling community, are that 1) the OWL-based ontologies are used to build a

model component repository (i.e., the Model Database) for geometry and dynamic models,

increasing the reusability of model components, and 2) Ontology Explorer itself provides

ontology-editor functionality.
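
To illustrate the kind of OWL content involved, the following is a minimal, hypothetical sketch written with the rdflib Python library (not the Ontology Explorer code itself, which appears in Appendix A.3). The hasGeometry and hasDynamic properties and the Aircraft, GeometryModel, and DynamicModel classes follow the figures and class descriptions in Chapters 3 and 4; the namespace URI, file name, and instance names are assumptions made only for this example.

from rdflib import Graph, Namespace, RDF, OWL

SCENE = Namespace("http://www.example.org/scene.owl#")   # assumed URI

g = Graph()
g.bind("scene", SCENE)

# classes of the scene ontology
for cls in (SCENE.Aircraft, SCENE.GeometryModel, SCENE.DynamicModel):
    g.add((cls, RDF.type, OWL.Class))

# object properties that link a scene object to its model components
g.add((SCENE.hasGeometry, RDF.type, OWL.ObjectProperty))
g.add((SCENE.hasDynamic, RDF.type, OWL.ObjectProperty))

# one instance: an F15 tied to a geometry object and a dynamic model
g.add((SCENE.F15, RDF.type, SCENE.Aircraft))
g.add((SCENE.F15, SCENE.hasGeometry, SCENE.F15_geometry))
g.add((SCENE.F15, SCENE.hasDynamic, SCENE.F15_fsm))

g.serialize(destination="scene_instances.owl", format="xml")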

1.2.3 Present an Interaction Model Concept

Human-computer interactions are needed to achieve the integrative multimodeling

environment. To facilitate the integrative multimodeling, the interactions should be

formalized and executable in the Blender environment. Therefore, the concept of

interaction model is created and represented as a function block model (FBM) for

formalizing human-computer interactions. Blender logic graphs are then used to model

and execute the interaction model.
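
As a concrete illustration (a minimal sketch only; the event and function names are illustrative assumptions, not the dissertation's code), an interaction model can be read as a small FBM in which a raw user event flows through function blocks whose output decides which model type the scene presents:

def classify_event(event):
    """First block: map a raw user event to an abstract interaction."""
    return "toggle_view" if event == "click_on_model_object" else "ignore"

def choose_presentation(interaction, current):
    """Second block: decide which model type to display next."""
    if interaction == "toggle_view":
        return "dynamic" if current == "geometry" else "geometry"
    return current

shown = "geometry"
for event in ["click_on_model_object", "mouse_move", "click_on_model_object"]:
    shown = choose_presentation(classify_event(event), shown)
    print(shown)   # prints: dynamic, dynamic, geometry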

HCI techniques play an important role in integrative multimodeling to connect and

visualize different model types together seamlessly within the same digital environment.

Many techniques are being used for supporting user interactions in 3D space. If we

consider a desktop-based interaction environment, the keyboard and mouse are the

primary interaction devices. Therefore, interaction methods should be sensor-based or

scripting-based. Using the interaction devices and interaction methods, modelers or users

visualize or navigate their 3D worlds. Diverse interaction methods are found in the

literature, such as toolbar-based, Windows-based, button-based, or scripting-based user

interactions. For example, Campbell and his colleagues [48] have developed a virtual

geographical information system (GIS) using GeoVRML and Java 3D software

development packages. They employ a menu bar and toolbars for ease of use because

most users immediately understand how to use the menu bar and toolbars. Cubaud and










Topol [49] present a VRML-based user interface for a virtual library, which applies 2D

Windows-based interface concepts to a 3D world. They allow users to move, minimize,

maximize, or close Windows by dragging and dropping them or by pushing a button,

which is usually provided in a traditional Windows system environment. Lin and Loftin

[50] provide a functional virtual reality (VR) application for geoscience visualization.

They employ virtual button and bounding box concepts to interact with geoscience data.

If interaction is needed, all the control buttons on the frame can be visible; otherwise they

are set to be invisible so that the frame simply acts as the reference outside the bounding

box. Hendricks, Marsden, and Blake [51] present a VR authoring system. The system

provides three main modules, graphics, scripting, and events modules, for supporting

interactions. If we consider all interaction methods previously described, the possible

interaction ways within a desktop-based environment are "Virtual Button," "Windows,"

and "scripting-based interaction" approaches. In the "Virtual Button" and "Windows"

cases, we can implement the concepts using "touch sensor" or "IndexedFaceSet." If we

employ an additional technology, such as Hypertext Preprocessor in VRML, "scripting-

based interaction" could be a possible method.

The open source Blender package provides powerful sensor-based interaction

methods. In Chapter 5, we will introduce and explain the sensor-based interaction

approaches using two examples.

1.2.4 Improvements in the Visual Model Construction

RUBE, which was developed by the Modeling and Simulation Research Group

within the Graphics, Modeling and Arts (GMA) Laboratory at the University of Florida,

provides a web-based Modeling and Simulation environment that enables modelers to

define, create, and execute a customized or personalized dynamic model, such as a finite









state machine (FSM), a functional block model (FBM), or a multimodel containing

multiple dynamic models (i.e., FBM and FSM). However, the primary disadvantage of

RUBE is that it is a little cumbersome to build models. It does not have library concepts

for model components and functions. Whenever users want to build their models in the

RUBE environment, they have to create model objects and the corresponding functions in

every modeling process, even though the same objects and functions are used frequently.

Another problem is the issue of topological connectivity, which must

be specified manually during the modeling process. A solution is to induce the topology from the

geometry using an intelligent algorithm.

As a solution for the first problem in RUBE, we develop Model Explorer and

provide library systems for dynamic model objects, MXL, and functions to facilitate the

dynamic modeling process. Model Explorer allows users to search for or import

appropriate Blender objects from the object library. If users need their own customized or

personalized objects, they can generate those objects and store them in the

object library using Export in the Blender Interface.

A "snap to grid" algorithm will be introduced in Chapter 5 as a solution for the

second problem. The algorithm builds the scaling, rotation, and translation matrices for

connection objects, such as arrows and/or pipes, using vector operations. Given the

non-connection-related objects in a 3D scene, users do not need to relocate, resize, or

re-rotate any connection objects; the algorithm places the

connection objects exactly between the source and target blocks according to user

requests.
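
As an illustration of the underlying vector arithmetic (a minimal sketch only; the actual Blender Interface implementation appears in Appendix A.4), the function below assumes a connector object modeled with unit length along its local +Y axis and computes the translation, scale, and rotation that place it between two block centers:

import math

def _cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def connector_transform(src, dst):
    """Place a unit connector (local +Y axis) between two block centers."""
    d = [dst[i] - src[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in d))
    mid = [(src[i] + dst[i]) / 2.0 for i in range(3)]   # translation: midpoint
    y = [c / length for c in d]                         # connector axis in world space
    ref = [1.0, 0.0, 0.0] if abs(y[0]) < 0.9 else [0.0, 0.0, 1.0]
    x = _normalize(_cross(ref, y))                      # any axis perpendicular to y
    z = _cross(x, y)                                    # completes the orthonormal basis
    rot = [[x[0], y[0], z[0]],
           [x[1], y[1], z[1]],
           [x[2], y[2], z[2]]]                          # rotation matrix, columns x, y, z
    return mid, length, rot

# Example: a connector between blocks centered at the origin and at (3, 4, 0)
print(connector_transform([0.0, 0.0, 0.0], [3.0, 4.0, 0.0]))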









1.3 Organization of the Dissertation

The rest of the dissertation is organized as follows. Chapter 2 reviews background

knowledge, including the concept of a dynamic model, RUBE (our XML-based modeling

and simulation framework), and ontologies. Chapter 3 explains and discusses our approach

and methodology for integrative multimodeling. Chapter 4 presents the integrative

multimodeling environment. In Chapter 5, we demonstrate how the methodology is

applied to real-world applications using two examples. We conclude the dissertation by

discussing future research in Chapter 6.


CHAPTER 2
BACKGROUND

2.1 Dynamic Models

Before we discuss our modeling approach and methodology, let us review a

unifying formalism that serves to represent a wide variety of system models. A

deterministic system within classical systems theory is defined as

follows [25]:

* T is the time set. For continuous systems, T = R (reals), and for discrete time
  systems, T = Z (integers).

* U is the input set containing the possible values of the input to the system.

* Y is the output set.

* Q is the state set.

* R is the set of admissible (or acceptable) input functions. This contains the set of
  input functions to use during system operation. Often, due to physical limitations,
  R is a subset of the set of all possible input functions (T → U).

* δ is the transition function, defined as δ: Q × R → Q.

* h is the output function, h: Q → Y.
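
Collecting these components, the system can be written as a single tuple; the discrete-time reading of how state and output evolve below is an interpretive addition for clarity, not part of the cited definition [25].

\[
S \;=\; \langle T,\, U,\, Y,\, Q,\, R,\, \delta,\, h \rangle,
\qquad q_{t+1} = \delta(q_t, \omega), \qquad y_t = h(q_t),
\]

where \(\omega \in R\) is the admissible input (segment) applied at time \(t\), \(q_t \in Q\) is the state, and \(y_t \in Y\) is the output.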

A pair consisting of a time and a state (t, s), where s ∈ Q, is called an event. Events

are points in event space just as states are points in state space. Event space is defined as

T × Q. State and event are critical aspects of any system, and by focusing on one or the

other, we form two different sorts of models: declarative models that focus on the

concept of state, and functional models that focus on the concept of event. In declarative

modeling, as shown in the 2D diagram of Figure 2-1, we build models that focus on state

representations and state-to-state transitions. In functional modeling, as shown in Figure

2-2, we focus on the system as a coupled network of functions, each of which takes inputs

and produces outputs.


Figure 2-1. Declarative Model

Figure 2-2. Functional Model

The declarative modeling approach suggests that we look at the world through a

sequence of changes in state and it is very good for modeling problem domains where the

problem decomposes into discrete temporal phases. Finite State Automata (FSA) permit

us to model systems with a minimum of components. An FSA is a system with states and

transitions. A state, depicted as a circle, represents the current condition of a system for









some length of time, and transitions, depicted as arrows, enable the system to move from

one state to another during the simulation. The basic definition of the system

<T, U, Y, Q, R, δ, h> provides the semantics for declarative models, as described above for the

deterministic system.

Functional models map very easily into the OO paradigm, since a function is

represented as a method located within an object, and they are often used to model continuous

systems and discrete event systems composed of coupled functional blocks.

These functional models are also useful when the problem is given in terms of

distinct physical objects which are connected in a directed order, or when the problem involves

a material flow throughout the system. Function-based models contain a transfer function

which relates input to output. The functions, along with inputs and outputs, are often

depicted in a "block" form, especially when a block is the iconic representation of the

physical device being modeled.
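
As a minimal sketch of this block view (block names and bodies are placeholders, not models from this work), the coupling below wires the output of one transfer function into the input of the next:

blocks = {
    "double": lambda x: 2 * x,        # first transfer function
    "increment": lambda x: x + 1,     # second transfer function
}
order = ["double", "increment"]       # directed coupling: input -> double -> increment -> output

def evaluate(u):
    """Push an input value through the coupled block network."""
    signal = u
    for name in order:
        signal = blocks[name](signal)
    return signal

print(evaluate(3))   # -> 7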

We discuss mapping rules between the formal definition of system and ontological

domain knowledge in Section 2.3 by showing how each element in the description could

be connected to domain knowledge.

2.2 RUBE: An XML-based 2D and 3D Modeling Framework for Simulation

We have developed a dynamic modeling and simulation framework called RUBE

over the past four years [52, 53]. The purpose of RUBE is to facilitate dynamic multimodel

construction and reuse within a 2D environment [54] or a 3D immersive environment

[55].

A simplified overall structure of the XML-based RUBE framework is shown in

Figure 2-3. RUBE is an XML-based framework and application, which permits the users

to specify and simulate a dynamic model, with an ability to customize a model










presentation using 2D or 3D visualization. In Figure 2-3, the RUBE framework is defined

using two stages: model representation and model creation.

For model representation, a dynamic model, which is generated by a 2D or a 3D

interface, is composed of two sorts of files: a scene file, which contains 2D or 3D

geometry objects, and a model file, which is represented by MXL. RUBE uses Sodipodi

[56] and Blender [57] for each representation of the 2D or the 3D model scenes and

behaviors. The scene files don't have any information about model behavior or dynamics

except regarding the appearance of the model. The scene files can be either 2D or 3D

XML documents: SVG (Scalable Vector Graphics) [58] or X3D (eXtensible 3D) [59],

respectively. The MXL file is the model file: it describes the behavior of the model as a

heterogeneous multimodel at an abstract level, using formalisms such as FBM

(Functional Block Model) and FSM (Finite State Machine).


Figure 2-3. RUBE Framework

For model creation, a 2D or a 3D merge engine, which uses XSLT (eXtensible

Stylesheet Language Transformation) [60], merges two XML documents: a scene file and

a model file. For model simulation, the MXLtoDXL translator translates a model file

written in MXL into a low-level modeling language called DXL, which can be described

with homogeneous simple block diagrams. The DXL is translated into an executable

programming code for the model simulation using the DXLtoJavascript translator. The

programming code can be executed using SimpackJ/S [61, 62], which provides the

underlying code foundation libraries, classes, and objects.

For example, a four-stroke gasoline engine has four phases, which cycle until the

engine is turned off. The four phases are Compression, Ignition, Expansion, and

Exhaustion. In this gasoline engine, injected fuel vapor is compressed by the piston of the

cylinder (Compression) and then the fuel is ignited (Ignition). As a result of the ignition,

the piston of the cylinder moves back (Expansion). The resulting fumes exit through the

exhaust manifold (Exhaustion).


Figure 2-4. An FSM Describing a Four-Stroke Gasoline Engine [25]

The behavior of the engine can be described as the FSM in Figure 2-4. While an

input value of 0 will let the engine stay at the current state, the value 1 causes the engine

to move to the next state. An input value of 2 in the ignition state indicates that the

ignition is turned off (Off state).
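
To make the transition behavior concrete, here is a small Python sketch of the engine FSM just described (the state and input encodings are chosen only for this sketch; the simulation code RUBE actually generates targets SimpackJ/S instead):

CYCLE = ["Compression", "Ignition", "Expansion", "Exhaustion"]

def delta(state, u):
    """Transition function: 0 stays put, 1 advances the cycle, 2 in Ignition turns the engine off."""
    if state == "Off" or u == 0:
        return state
    if state == "Ignition" and u == 2:
        return "Off"
    if u == 1:
        return CYCLE[(CYCLE.index(state) + 1) % len(CYCLE)]
    return state

state = "Compression"
for u in [1, 2]:                 # advance to Ignition, then switch the ignition off
    state = delta(state, u)
print(state)                     # -> Off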

The engine system can also be modeled with a pipeline of three functions to create an

FBM. To execute the engine, input to and output from the engine must be specified from

the outside world. The engine can be embedded inside the FBM with one input

and one output block associated with it. The input block feeds the input value to the


engine. The engine block represents an engine model that behaves as described in Figure

2-4. The current state of the engine is passed to the next block, namely the output block.

From the model representation (2D or 3D), we can generate an MXL file for the


example. Figures 2-5 and 2-6 show the MXL representations for the example.