
Material Information

Title:
DRISHTI: AN INTEGRATED NAVIGATION SYSTEM FOR THE VISUALLY IMPAIRED AND DISABLED
Copyright Date:
2008

Subjects

Subjects / Keywords:
Coordinate systems
Databases
Global positioning systems
Grammar
Java
Mobile devices
Navigation
Personal computers
Visually impaired persons
Walkways

Record Information

Source Institution:
University of Florida
Holding Location:
University of Florida
Rights Management:
Copyright the author. Permission granted to the University of Florida to digitize, archive and distribute this item for non-profit research and educational purposes. Any reuse of this item in excess of fair use or other copyright exemptions requires permission of the copyright holder.
Embargo Date:
8/8/2002
Resource Identifier:
51549698 (OCLC)


Full Text

DRISHTI: AN INTEGRATED NAVIGATION SYSTEM FOR THE VISUALLY IMPAIRED AND DISABLED

By

STEVEN EDWIN MOORE

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2002

Copyright 2002 by Steven Edwin Moore

For my father who is blind yet still sees so much.

ACKNOWLEDGMENTS

I would like to thank my advisor, Dr. Abdelsalam (Sumi) Helal, for all his support and encouragement, and for giving me the opportunity to work on such an exciting project. I would like to thank Dr. Gerhard Ritter and Dr. Scot Smith for serving on my graduate committee and for all the encouragement they gave me. I would also like to thank my fellow students Balaji Ramachandran and Choonhwa Lee for their work on this project. Balaji worked alongside me from beginning to end, and Choonhwa helped get the system up and running. I would like to thank Dr. Loukas Arvanitis for giving me access to his GPS equipment and for always telling me to persevere. I would especially like to thank my wife, Alicia; my daughter, Sydney; and my son, Elias, for their love, support and patience throughout this journey. I would also like to thank my father, Dr. Theral Moore, for giving valuable input to this project and testing it out. Last but not least, I would like to thank my mother, Nancy, for being my mother.

TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTERS

1 INTRODUCTION
    Empowering Technologies
        Wearable Computing
        Speech Synthesis and Voice Recognition
        Wireless Communication
        Geographical Information Systems
        Global Positioning System
            Three segments
            Operation
2 RELATED WORK
3 METHODOLOGY
    Problem Domain
        Proving Ground
        Requirements and Functionality
    System Design
        Hardware Components
            Wearable Computer
            Differential GPS (DGPS) Receivers
            Wireless Network
        Software Components
            Spatial database
            Route server
            Vocal user interface
        GIS Database
        System Architecture
            Communication
            Client manager/proxy
            Client
            Navigation
4 ASSESSMENT
    Challenges
        GIS
        COTS
        Speech Interface
    Issues
    Assessment Summary
5 CONCLUSIONS AND RECOMMENDATIONS
    Conclusions
    Recommendations
LIST OF REFERENCES
BIOGRAPHICAL SKETCH

LIST OF TABLES

3-1. User browses list of available destinations and then requests a route
3-2. GGA sentence field definitions
4-1. Drishti functionality compared to related work

LIST OF FIGURES

1-1. Raster vs. vector
1-2. Feature geometries
3-1. Campus study area
3-2. Path generated by route server based on user's preference
3-3. Visual display of auditory cue
3-4. Path buffered to walkway width
3-5. Simple path without any obstacles
3-6. Path re-routed after a special events blockade
3-7. Wearable mobile client
3-8. Client/proxy/server architecture
3-9. Negotiation to set up a connection for a client
3-10. Client manager architecture
3-11. Client components
3-12. Drishti unified path

Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

DRISHTI: AN INTEGRATED NAVIGATION SYSTEM FOR THE VISUALLY IMPAIRED AND DISABLED

By Steven Edwin Moore

August 2002

Chair: Abdelsalam (Sumi) Helal
Department: Computer and Information Science and Engineering

Drishti is a wireless pedestrian navigation system. It integrates several technologies, including wearable computers, voice recognition and synthesis, wireless networks, Geographic Information Systems (GIS), and the Global Positioning System (GPS). Drishti provides contextual information to the visually impaired and computes optimized routes based on user preference, temporal constraints (e.g., traffic congestion), and dynamic obstacles (e.g., ongoing ground work, road blockades for special events). The system constantly guides the blind user based on static and dynamic data. Environmental conditions and landmark information queried from a spatial database along the route are provided on the fly through detailed explanatory voice cues. The system also provides the capability for the user to add intelligence, as perceived by the blind user, to the central server hosting the spatial database. Drishti is supplementary to other navigational aids such as canes, guide dogs and wheelchairs.

CHAPTER 1
INTRODUCTION

Knowledge is power, or so the saying goes. The world is being filled with information and with ways to disseminate this information into portable computing devices that can be used to help turn this information into knowledge and thus into power. My father is one of many people on the Earth who lack the power to navigate from one place to another independently because he is visually impaired. This work is a study in using maturing technologies to give power to those who need it.

Commuting in a crowded environment is much more challenging for the blind than for a sighted person. Even sighted persons, navigating in an unknown environment, require some form of directional clues or maps to reach their destination. Visually impaired persons are at a disadvantage because they do not have access to contextual information and the spatial orientation of their immediate proximity. For navigation, the knowledge structure typically used by the blind is a mixture of declarative and route knowledge [Golledge et al. 1998]. Often the blind must rely on repetitive and regular routes with minimal obstructions for their daily movement in a predefined area of the campus. Still, these routes are not free from unexpected hazards or obstacles such as a puddle of water in a poorly drained area, a slippery walkway after a rain event, groundwork signs placed by a work crew, a broken tree limb, or sprinkler systems. To a certain extent, traditional navigation aids like long canes and Seeing Eye dogs provide assistance in such scenarios. Even such aids cannot detect overhanging broken tree

branches to prevent accidents from occurring to the blind person. The biggest hurdle for blind and disabled people is to travel through unknown or dynamically changing environments. A school such as UF, with constant athletic and special events requiring detours and road blockades, can be characterized as a dynamically changing campus.

The focus of this thesis is a navigation aid for the blind and disabled. The goal of the work is to provide adaptive navigation support in such environments. The work done here to address this problem can easily be extended to support multiple applications such as routine building maintenance by physical plant crews, emergency response systems, and tourist campus guides.

Empowering Technologies

Over the last few years, many technologies have converged to a mobile form or size.

Wearable Computing

A wearable computer lets the traveler go mobile with a sizable amount of computing power without being the focus of the traveler's environment. It enhances the traveler's environment but does not become the center of attention as a handheld computer might, since a handheld ties up more of the traveler's resources (namely, their hands). Wearable computers offer three operational modes [Mann 2001]:

Constancy: The computer is always on and is always ready to interact with the user.

Augmentation: The assumption of wearable computing is that computing is not the primary task (in the traditional computing paradigm, computing is the primary task). The user will be engaged in some other simultaneous activity while computing. Thus the goal of the wearable is to augment the senses and the intellect of the user.

Mediation: The wearable computer can encapsulate the user and act as a mediator or filter of information on the way in to the user or out from the user.

We are taking advantage of at least the first two modes that wearable computing offers over the traditional computing paradigm. We are also interested in the six signal paths of

wearable computing, which are Unmonopolizing, Unrestrictive, Observable, Controllable, Attentive, and Communicative [Mann 2001].

Speech Synthesis and Voice Recognition

Many people lack the power afforded by vision. Speech synthesis (also known as text-to-speech) gives computers the ability to speak. It is the process of turning text produced by an application into audio sounds that humans can understand as the text; that is, taking information meant for the eyes and converting it into a form designed for the ears. Voice recognition gives computers the ability to listen and thus allows users to provide input via voice in place of the usual graphical methods, which are tied tightly to vision. The process involves words spoken in the form of analog sound being captured by a microphone and then converted into a digital format on a sound card. This digital format is then matched against phonemes by a voice-recognition engine, from which words are matched, resulting in information that can be delivered to an application. The words that the recognition engine attempts to match come from a grammar that has been loaded into the engine. There are two kinds of grammars. For command-and-control applications, we use a rule grammar. For continuous-speech applications, we use a dictation grammar. Rule grammars can be defined using the Java Speech Grammar Format (JSGF).

Wireless Communication

Wireless data coverage and access methods are growing in all directions. The campus here at the University of Florida is rolling out an 802.11b wireless LAN network that provides 11 Mbps of bandwidth. We also have access to an iDEN wireless packet data phone from Motorola offering a 40 Kbps raw data rate.
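Returning to rule grammars: they can be written down compactly in JSGF. The fragment below is an illustrative sketch only; the grammar name, rule names, and place list are invented for this example and are not taken from the Drishti implementation.

```jsgf
#JSGF V1.0;

grammar navigation;

// Public rule the recognition engine matches against user speech.
public <command> = <route> | <where>;

// e.g., "take me to the mathematics building", "go to the stadium"
<route> = (take me to | go to) [the] <place> [building];

// e.g., "where am I"
<where> = where am I;

<place> = mathematics | stadium | computer science;
```

An engine loaded with such a grammar only tries to match utterances the rules can produce, which is what makes command-and-control recognition far more reliable than open-ended dictation.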

Geographical Information Systems

Commonly referred to as GIS, this is the technology that helps people convert spatial data into information, then into knowledge, and finally into power. Spatial databases add the power of Database Management Systems to GIS data. GIS data generally represents locations on the earth. There are two general formats used to model such spatial information: vector and raster. Raster data is cell based; vector data is coordinate based.

Figure 1-1. Raster vs. vector

Figure 1-1 shows the two formats used to model an arc. For applications such as environmental monitoring, raster is the preferred format, as it lets you keep track of small variations over continuous space. Raster is the model you would use when you are trying to detect boundaries between dense systems of gradual variation or find sources and sinks. For applications where well-defined boundaries are known, vector is the preferred format. Examples of these types of applications are planning and networks. Since the goal of our application is to find optimal networks through campus and navigate along well-defined edges, we are using vector data.

Figure 1-2 shows examples of feature geometries, one component of a vector-based GIS.

Figure 1-2. Feature geometries

The other components of a vector-based GIS are as follows:

Attributes. Each of the features above can have a set of attributes associated with it. For example, the point could be the location of a picnic table on campus, with attributes material, date created, and number of seats. The actual spatial coordinates of the table are implicit attributes.

Spatial relationship operators. A set of Boolean operators is available to test spatial relationships, such as Line.Contain(Multipoint), which returns true if the Line is a superset of the Multipoint and false otherwise.

Topological operators. A set of operators is available that return new geometries based on logical comparisons between sets of points in one or more geometries [Worboys 1995]. For example, one often-used operator is Buffer. Given a geometry and a buffer distance, the Buffer operator will return a polygon that covers all points whose distance from the geometry is less than or equal to the buffer distance.

Attribute queries. A set of Structured Query Language (SQL) operators to select feature geometries based on their attributes. An example would be to select all walkways that are paved.

With the components just mentioned, we can do such things as take one's current location, create a point feature out of it, buffer the point feature by a distance of 100 feet, and use this buffer in the spatial relationship operator CurrentLocationBuffer.Overlaps(Buildings) to find all the buildings that are within 100 feet of our current location. We can further constrain our feature selection using attribute queries on Buildings.

Another desirable and important component of a GIS is topology. Topology in a GIS is the spatial relationships between feature geometries [Zeiler 1999]. For example, a set of points may form a network; the network is the topology of that set of points. Another example is that a Polyline has a direction and thus a right and a left side.

Global Positioning System

Here again is a technology that is now supported on a small enough platform to be used to add location to a mobile system.

Three segments

The Navstar Global Positioning System is an expansive system deployed in the late sixties by the US Department of Defense (DoD) for strategic military navigation. The system provides accurate, continuous, worldwide, three-dimensional position and velocity information to users. The GPS has three main segments:

Space segment. Consists of a constellation of 24 satellites maintained by the US government; the constellation is called Navstar. Each satellite is in a precise orbit approximately 20,200 kilometers above the earth.

Control segment. The heart of the system. Consists of 5 Monitoring and Control Stations that continually monitor each satellite to keep it aware of its precise position in space and any clock drift that it may experience. It is supervised and managed by the DoD from Falcon Air Force Base, Colorado Springs, USA.

User segment. Several applications utilize GPS signals for mapping, navigation and many other tasks. Some of the civilian GPS applications are in precision agriculture, transportation, aviation, vehicle tracking, emergency response services, forestry, wildlife and other natural resource fields.

Operation

In terms of accuracy, GPS receivers can be broadly classified into three categories: survey (high), mapping, and recreational (low). The main sources of error in the GPS signal are clock error, multipath, and atmospheric delay. Clock errors are due to the differences in quality between satellite and receiver clocks. Satellites are equipped with very precise atomic clocks, while receivers depend on the reliability of their internal frequency standard, the oscillator [Van Sickle 1996]. Multipath error is caused when signals from the satellites are received after bouncing off obstacles such as buildings, the ground

and others. The apparent speed and direction of the signal from the satellites are affected by both ionospheric refraction and diffraction.
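To give a rough sense of why clock quality matters, here is an illustrative back-of-the-envelope calculation (not a figure from this thesis): since GPS ranges are measured by signal travel time, a receiver clock error translates directly into a pseudorange error of the speed of light times the clock error.

```java
// Illustrative sketch: pseudorange error caused by a receiver clock error.
// Ranges are derived from signal travel time, so range error ~= c * (clock error).
public class ClockError {

    static final double C = 299_792_458.0; // speed of light, m/s

    // Range error in meters for a given clock error in seconds.
    static double rangeError(double clockErrorSeconds) {
        return C * clockErrorSeconds;
    }

    public static void main(String[] args) {
        // Even a one-microsecond clock error corresponds to roughly 300 m of
        // range error, which is why receivers solve for their own clock bias
        // as a fourth unknown alongside the three position coordinates.
        System.out.printf("1 us clock error -> %.0f m range error%n", rangeError(1e-6));
    }
}
```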

CHAPTER 2
RELATED WORK

The work done in this thesis (on Drishti) is closely related to augmented reality and user interfaces. In augmented reality, the user's real-world experience is supplemented by a virtual world [Sutherland 1968]. In this case the virtual world is modeled through a GIS database. The user's context with respect to his current location, obtained by GPS, is provided dynamically from this spatial database. Most of the interaction with the system occurs through vocal input, with auditory feedback for the blind and visual feedback for the disabled. Initial efforts in augmented reality dealt with see-through head-mounted displays for assistance in applications like aviation, surgery, maintenance and repair, building restoration work, and parts assembly [Feiner et al. 1997, Thomas et al. 1998]. The common feature of all these applications is the precise tracking of one object in relation to others. Further, such applications are usually restricted to small operating zones and tethered to a fixed network.

In 1991, Golledge et al. were the first to propose the use of GIS, GPS, speech, and sonic sensor components for blind navigation, in progress notes on the status of GIS. MOBIC is a GPS-based travel aid for the blind and elderly. It also uses a speech synthesizer to recite predetermined travel journey plans [Petrie et al. 1996]. This test prototype is implemented on a handheld computer with preloaded digital maps and limited wireless capabilities to get the latest information from a remote database. A similar system was implemented by Golledge et al. (1998) using a wearable computer. Other terrestrial

navigation support using augmented reality has been developed for sighted people. Metronaut is CMU's campus visitor assistant; it uses a bar code reader to infer its position from a series of bar code labels placed at strategic locations around the campus [Smailagic et al. 1997]. Similar systems have been developed in which the current position of the user is used to overlay textual annotation and relevant information from web servers to coincide with the image captured through a head-mounted display [Feiner et al. 1997, Thomas et al. 1998]. Smart Sight, the tourist assistant developed by Yang et al. (1999), is a slight variation on this approach. It gives the user a multi-modal interface, which includes voice, handwriting and gesture recognition.

Attempts have also been made to use computer vision techniques for outdoor augmented reality applications. Behringer et al. (1999) have developed a system based on the use of horizon silhouettes from precise orientation of the camera to the user's view. This approach is more applicable to natural terrain environments [Behringer et al. 2000]. A similar approach is used for registration in an urban environment, with the exception that the line of sight is registered by comparing the video frame or digital image with a 3D virtual GIS model [Chen et al. 1999, Coors et al. 2000]. The main breakthrough of these augmented reality applications is that they cover a larger footprint. Indoor navigation systems have also been developed using similar concepts. Since GPS does not work inside a building, most systems rely on relative positioning using sensors such as active badges, digital tags, accelerometers, temperature sensors, photodiodes and beacons [Ertan et al. 1998, Golding et al. 1999, Long et al. 1996, Randell et al. 2000]. The People Sensor, an electronic travel aid for the visually impaired, uses pyroelectric and ultrasound sensors to locate and

differentiate between animate and inanimate obstructions in the detection path [Ram et al. 1998]. Image sequence matching methods are also used to give real-time positioning. This approach involves a set of trained images for each location. The image captured from the head-mounted camera is compared against the trained set to give the user context and location information [Aoki et al. 1999].

Computer vision techniques show promise for real-time positioning and tracking. However, in practice there are many constraints that must be satisfied for such systems to work. Image registration techniques work only in perfect environmental conditions. Most related work reviewed so far deals with situations in which the user has an unobstructed view of the target object or scene. The right amount of lighting and ambient weather are prerequisites for such techniques to work outdoors. For indoor navigation using such systems, very few have addressed issues such as what happens when a light bulb burns out, the intensity of a bulb goes down, or other people sharing the corridor partially obstruct the user's view. Most of them are standalone systems that put a substantial load on the network. Generating image training sets for each location of a building under various conditions is a monumental task by itself. In applications using registration with virtual 3D models, accurate positioning and camera orientation are required, which is difficult to achieve. Also, it is observed that sensor-based approaches are better suited to indoor navigation than to outdoor.

One major limitation of all the systems discussed so far is the lack of dynamic query capabilities and support for a dynamically changing environment. Also, context awareness is not well supported. The term context awareness, as applied to mobile location services, is related to delivering temporal and location-specific information like

building names and traffic descriptions within the vicinity, general weather reports for the area, etc. [Bowskill et al. 1999]. In my view, contextual awareness is very important in navigating visually impaired people. It provides the immersive experience needed to augment the "blank" reality of these users. The main focus of this thesis is the development of a prototype that navigates visually impaired and disabled people in a dynamic outdoor environment, while providing contextual awareness during the navigation process.

CHAPTER 3
METHODOLOGY

This chapter presents the problem domain, system design and system architecture.

Problem Domain

When people walk from one place to another, they make use of several different inputs. When a visually impaired person walks from one building to another on campus, he lacks many of these useful inputs. The goal is to develop a system that augments a visually impaired person's pedestrian experience with enough contextual information to make him feel comfortable on a walk from one location to another.

Figure 3-1. Campus study area.

Proving Ground

A map of the study area of the University of Florida (UF) campus is shown in Figure 3-1. For brevity and clarity, only major layers of our GIS database and only a portion of the study area are shown. The study area covers about one fourth of the actual campus. To evaluate the efficiency of the prototype, we made sure to select an area that includes various scenarios such as crowded walkways, close buildings, services, etc.

Requirements and Functionality

The prototype was designed based on Dr. Theral Moore's input and a review of the literature on user requirements for the blind. One of the key requirements is to deliver information along the blind person's path in real time through auditory cues. For example, a visually impaired individual might be standing anywhere on campus and ask where the mathematics department is. Given the user's current position, a vocal map highlighting a route to the appropriate building should be presented. As the person travels along this route, they can be aided in their quest. Further, such a system should be capable of generating routes preferable to the user. Most route selection algorithms provide the shortest path. Often, for a blind user, the shortest path to the destination is not the best route; the best route is the one with the fewest hazards. To illustrate, Figure 3-2 shows the best path generated by our prototype route server. The blind person is required to commute a segment of the path almost twice to avoid a crossover of a busy street and bike lanes. This may not be a major issue for a disabled person. The user should also have the option of avoiding congested walkways at a particular time of the day.
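The route server's actual implementation sits on top of a commercial GIS, but the idea that the "best" route minimizes hazard-weighted cost rather than raw distance can be sketched with an ordinary shortest-path search over a walkway graph. In the hypothetical sketch below (node names and penalty values are invented, not taken from Drishti), each edge's cost is its length inflated by a hazard penalty, so a short but hazardous street crossing loses to a longer, safer detour.

```java
import java.util.Comparator;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.PriorityQueue;

// Sketch of hazard-weighted routing via Dijkstra's algorithm.
// Edge cost = length_feet * (1 + hazardPenalty), so hazards lengthen
// a segment's effective cost without changing its physical length.
public class HazardRouting {

    record Edge(String to, double lengthFeet, double hazardPenalty) {
        double cost() { return lengthFeet * (1.0 + hazardPenalty); }
    }

    static List<String> bestRoute(Map<String, List<Edge>> graph, String src, String dst) {
        Map<String, Double> dist = new HashMap<>();
        Map<String, String> prev = new HashMap<>();
        PriorityQueue<String> pq = new PriorityQueue<>(
            Comparator.comparingDouble((String n) -> dist.getOrDefault(n, Double.MAX_VALUE)));
        dist.put(src, 0.0);
        pq.add(src);
        while (!pq.isEmpty()) {
            String u = pq.poll();
            if (u.equals(dst)) break;
            for (Edge e : graph.getOrDefault(u, List.of())) {
                double alt = dist.get(u) + e.cost();
                if (alt < dist.getOrDefault(e.to(), Double.MAX_VALUE)) {
                    dist.put(e.to(), alt);
                    prev.put(e.to(), u);
                    pq.remove(e.to()); // re-insert so the queue sees the new cost
                    pq.add(e.to());
                }
            }
        }
        // Walk predecessor links back from the destination to rebuild the path.
        LinkedList<String> path = new LinkedList<>();
        for (String n = dst; n != null; n = prev.get(n)) path.addFirst(n);
        return path;
    }

    public static void main(String[] args) {
        Map<String, List<Edge>> g = Map.of(
            "CS", List.of(new Edge("Crossing", 100, 4.0), new Edge("HubWalk", 150, 0.0)),
            "Crossing", List.of(new Edge("Math", 100, 4.0)),
            "HubWalk", List.of(new Edge("Math", 200, 0.0)));
        // The direct route via the street crossing is shorter (200 ft) but heavily
        // penalized; the walkway detour (350 ft) wins on hazard-weighted cost.
        System.out.println(bestRoute(g, "CS", "Math")); // [CS, HubWalk, Math]
    }
}
```

Re-routing after a special-event blockade would amount to raising the penalty on the blocked edges (or removing them) and re-running the same search from the user's current position.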

Figure 3-2. Path generated by route server based on user's preference.

Figure 3-3 shows a sample voice prompt summarizing the journey. Such prompts should include landmark information and warnings about potential hazards in the user's way. The system will warn the user when approaching a building whether a ramp or stairs need to be traversed to enter the building.

Starting from Computer Science
Turn left on to Hub Walkway 2
Travel on Hub Walkway 2 for 79 feet
Turn left on to Stadium Road Walkway
Travel on Stadium Walkway for 225 feet
Turn left into stop #2
Starting from stop #2
Turn right on to Stadium Road Walkway
Travel on Stadium Rd. Walkway for 225 feet
Continue straight onto Hub Walkway
Travel on Hub Walkway for 81 feet
Turn left onto Black Hall Walkway
Travel on Black Hall Walkway for 111 feet
Turn right into Mathematics

Figure 3-3. Visual display of auditory cue

In addition to generating routes, users need to be guided along the path within the walkway width. A path buffered to the walkway's width is shown in Figure 3-4. There will be situations when the user changes his mind and needs to be re-routed. In such cases the system must be capable of taking the user's current location and re-routing to the new destination. A similar scenario is depicted in Figures 3-5 and 3-6: after a blockade for a special event, the user is re-routed to the same destination through an alternate path.

Figure 3-4. Path buffered to walkway width

In some situations a user might prefer to add notes about certain conditions encountered, for example, that the walkway has a pothole. This information should be used to warn the user appropriately the next time he uses the same path, or the concerned authorities should fix the problem. In the same fashion, a blind user who is lost or has a sudden emergency needs to be able to contact the police and/or paramedics.
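The guidance check, keeping the user inside the path buffered to the walkway width, reduces to a point-to-path distance test. The sketch below is illustrative only (the coordinates and widths are invented; Drishti's implementation uses the GIS Buffer operator described in Chapter 1): it treats one walkway segment as a center line and flags the user as on the walkway when their position lies within half its width.

```java
// Sketch of the walkway-buffer check: the user is "on the path" when
// their GPS position is within half the walkway width of the center line.
public class WalkwayBuffer {

    // Distance in feet from point (px, py) to the segment (ax, ay)-(bx, by).
    static double distanceToSegment(double px, double py,
                                    double ax, double ay, double bx, double by) {
        double dx = bx - ax, dy = by - ay;
        double lenSq = dx * dx + dy * dy;
        // Project the point onto the segment, clamping to its endpoints.
        double t = lenSq == 0 ? 0 : Math.max(0, Math.min(1,
                ((px - ax) * dx + (py - ay) * dy) / lenSq));
        return Math.hypot(px - (ax + t * dx), py - (ay + t * dy));
    }

    // Is the user inside the buffered walkway of the given width (in feet)?
    static boolean onWalkway(double px, double py,
                             double ax, double ay, double bx, double by,
                             double walkwayWidthFeet) {
        return distanceToSegment(px, py, ax, ay, bx, by) <= walkwayWidthFeet / 2.0;
    }

    public static void main(String[] args) {
        // A hypothetical 10-ft-wide walkway running from (0, 0) to (100, 0).
        System.out.println(onWalkway(50, 4, 0, 0, 100, 0, 10)); // true: 4 ft off center
        System.out.println(onWalkway(50, 8, 0, 0, 100, 0, 10)); // false: 8 ft off center
    }
}
```

When the test fails, the system can issue a corrective voice cue; a full path is just a chain of such segments, checked against whichever one the user is currently traversing.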


Figure 3-5. Simple path without any obstacles

Figure 3-6. Path re-routed after a special-events blockade

System Design

In designing the prototype we made use of Commercial-Off-The-Shelf (COTS) hardware and software. This allowed us to focus on the functionality of the system.


Hardware Components

Our prototype weighs approximately 8 lbs, which we believe most blind and disabled persons can carry. The backpack is designed to distribute the load evenly.

Figure 3-7. Wearable mobile client

Figure 3-7 depicts a user (Steve) using the Drishti prototype on a test run. The wearable computer, along with the GPS receiver, is placed in the backpack. The user wears the head-mounted display for visual tracking (disabled users) and the integrated headset for speech I/O (blind users).

Wearable Computer

Our wearable computer, shown in Figure 3-7, is a Xybernaut MA IV with a Pentium 200 MHz processor, 64 MB of main memory, 512 KB of L2 cache, 1 MB of video RAM, a 2.12 GB hard drive, 2 PCMCIA slots, 1 USB port, a full-duplex sound card, a VGA head-mounted display, an ANC 600 Pro monaural headset, a two-button built-in pointing device,


and the MS Windows 98 operating system; the unit weighs 1.75 lbs. The built-in mouse is retained for on-screen tracking by disabled clients. A dual serial I/O PCMCIA card provides two serial ports.

Differential GPS (DGPS) Receiver

We are using a Trimble PROXRS, a 12-channel integrated GPS/Beacon/Satellite receiver with multipath rejection technology. It has a rated horizontal differential-correction accuracy of 50 cm + 1 ppm on a second-by-second basis; these values apply in environments with a clear view of the sky. It accepts RTCM SC standard format input for real-time differential correction and produces NMEA-0183 output. The receiver, including the antenna, weighs approximately 2 lbs.

Wireless Network

We used several wireless network connections through the different test phases of this project. We used 802.11b wireless LAN technology providing 11 Mbps of bandwidth; the advantage of 802.11b is that it is easy to transport and set up for demo purposes (e.g., at conferences and similar meetings). We have also used 9600 bps GSM circuit-switched connections requiring an Ericsson phone and a PC card. More recently, we acquired iDEN wireless packet-data phones from Motorola offering 40 Kbps raw data rates; the phone connects to the wearable via the serial port. Eventually, a campus-wide outdoor wireless network using 802.11b will be the network of choice for users of our prototype. Omni antennas connected to 802.11b access points are currently being deployed on the UF campus (see Figure 3-7, upper-left corner).


Software Components

Spatial database

Environmental Systems Research Institute's (ESRI) ArcSDE is a spatial database engine that acts as a gateway for managing GIS datasets on a relational database management system (RDBMS). In our case, we used the Oracle 8i Standard Edition object-relational DBMS on Sun Solaris 2.8. Until recently, most GIS systems were file-based, with limited multi-user capabilities. ArcSDE has its own Java API, which came in handy for our development purposes.

Route server

We are using ESRI's ArcView Network Analyst extension to generate routes based on least-cost paths. ArcView is a desktop GIS application that allows one to view, edit, and analyze GIS datasets. Network Analyst runs within the process space of ArcView and has an API that is accessible through an ArcView scripting language called Avenue. Finally, to build a route server accessible to our application, we run ArcView as a Remote Procedure Call (RPC) server.

Vocal user interface

IBM's ViaVoice runtime and IBM's implementation of the Java Speech API are employed to provide a spoken-dialogue user interface. This involves text-to-speech synthesis, voice recognition, and the design of a dynamic recognition grammar. Table 3-1 shows a sample spoken dialogue between the system and the user, Theral in this case.


Table 3-1. User browses the list of available destinations and then requests a route

Theral > "where can I go"
Drishti > "known buildings are," "Little," "Music," "Tigert"
Theral > "more"
Drishti > "Computer Science Engineering," "Matherly"
Theral > "departments"
Drishti > "known departments are," "Mathematics," "Journalism"
Theral > "more"
Drishti > "Computer Science," "Forestry," "end of list"
Theral > "Stop"
Drishti > "ok"
Theral > "route"
Drishti > "from"
Theral > "Mathematics"
Drishti > "did you say Mathematics"
Theral > "yes"
Drishti > "to"
Theral > "Computer Science"
Drishti > "did you say Computer Science"
Theral > "yes"
Drishti > "ok, and away we go"

GIS Database

The GIS dataset for the campus was obtained from UF's Physical Plant Division (UFPPD), whose Architecture and Engineering Department provides services for renovation, remodeling, and new construction. The scale of the dataset is a critical factor in blind navigation: datasets with detailed information are required, and a scale as close as possible to 1:1 is the most appropriate. The scale of our datasets was 1:1200, which is very good for most planning applications. In any GIS-related project, about 80 percent of the time is dedicated to data collection, so at present we are evaluating the UFPPD datasets; if necessary, a new set will be created. The dataset included building locations, streets, parking, walkways, and building plans. These layers became the base layers for our prototype. Other layers vital to the project, to name a


few, such as the locations of trees, fire hydrants, utility poles, bike racks, steps, and traffic lights, were mapped using DGPS. We restrict navigation to walkways; for this purpose, the centerlines of the walkways were digitized to create a new layer. This layer is the network layer used by the Route Server to generate travel paths. The datasets were ground-truthed for both descriptive and spatial accuracy. We found that the GIS layers had a systematic error of 2 m, which we account for when determining the user's current location.

System Architecture

Figure 3-8 describes the client/proxy/server architecture of our prototype. The server manages incoming requests from mobile clients through a server-side proxy known as the client manager.

Figure 3-8. Client/proxy/server architecture
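The systematic 2 m error found during ground truthing, noted above, can be compensated with a constant shift applied to each projected GPS fix before it is compared with the GIS layers. The following is a minimal sketch of such a correction; only the 2 m magnitude comes from our ground truthing, while the split of that offset into easting and northing components here is hypothetical.

```java
// Sketch of compensating for the GIS layers' systematic 2 m error when
// locating the user: a constant easting/northing shift is applied to each
// projected (state plane) GPS fix. The DX/DY split is hypothetical; only
// the 2 m total magnitude is from our ground truthing.
class DatumShift {
    static final double DX = 1.2;  // meters, hypothetical easting component
    static final double DY = -1.6; // meters, hypothetical northing component

    // Shift a projected coordinate into the GIS layers' frame.
    static double[] toLayerFrame(double easting, double northing) {
        return new double[]{easting + DX, northing + DY};
    }

    // Magnitude of the systematic offset being corrected.
    static double magnitude() {
        return Math.hypot(DX, DY);
    }
}
```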


Each client manager acts as a gateway between the client it supports and the GIS database. The server and client managers were developed in Java. The proxy shields the mobile client from the details and software requirements of the server; this was important for keeping the mobile client simple and for controlling its footprint. Client managers make spatial queries and add new spatial information via ESRI's spatial database engine Java API. The GIS database is exposed to various campus departments, such as the University Police, Physical Plant, and Special Events, to give them the ability to insert and remove dynamic obstacles and to monitor the campus inputs added by individual users for their veracity.

Communication

Communication consists essentially of requests by the client followed by replies from a dedicated server-side monitor called the client manager; the mode of communication is asynchronous and connection-oriented. Both requests and replies are serialized objects. Figure 3-9 shows how communication channels are set up between the mobile client and a client manager thread. When a client wishes to connect to the server, it sends a request-to-be-monitored object to the main server at its well-known address. The main server then registers the client, starts a client manager thread, and passes the client's request object to it. The new client manager thread sends its address back to the registering client, letting it know that it is registered. Finally, the client responds to its new manager to finalize the arrangement. We are using UDP sockets to take advantage of their low overhead and to avoid the well-known problems of TCP/IP in wireless environments [Caceres et al. 1996]. Also, since navigation support is essentially a real-time application, we need the quickness of UDP over TCP/IP.
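Because UDP makes no delivery or ordering guarantees, the protocol on top of it must tolerate duplicate copies of a retransmitted request and location fixes that arrive out of order. A sketch of the bookkeeping this implies on the client-manager side, with illustrative class and method names: information requests carry a unique ID and are processed at most once, while location fixes carry a sequence number and anything older than the newest fix seen is dropped as stale.

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of client-manager bookkeeping over unreliable UDP: deduplicate
// retransmitted information requests by unique ID, and drop location
// fixes that arrive out of order. Names are illustrative.
class RequestFilter {
    private final Set<Long> seenRequests = new HashSet<>();
    private long newestFix = -1;

    // True if this information request should be processed (first copy seen).
    boolean acceptRequest(long requestId) {
        return seenRequests.add(requestId); // add() is false for duplicates
    }

    // True if this location fix is newer than any processed so far.
    boolean acceptFix(long sequence) {
        if (sequence <= newestFix) return false; // stale fix, drop it
        newestFix = sequence;
        return true;
    }
}
```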


Figure 3-9. Negotiation to set up a connection for a client

Since two types of information flow between the client and its proxy, one time-sensitive and the other not, each client has two listeners on datagram sockets. One listener, called the InfoListener, receives incoming information objects such as a list of destinations on campus or information about the building the client is passing at the moment; this type of information arrives in response to explicit requests. The other listener, called the DirectionsListener, listens for navigation prompts from the client's manager; this type of information arrives in response to implicit requests, in the form of the location coordinates that continuously stream from the client to the client manager. Each client manager has one listener, called the ClientListener, that listens for requests from the client. Information queries are acknowledged, whereas navigation queries are


not. The idea here is to separate requests that are less time-sensitive but must get through from requests that are time-sensitive but need not always get through. By "must get through" we only mean that one would prefer to ask for the information once, but could repeat the request if necessary. Navigation queries are just current-location objects, and we need not put effort into making sure this type of information gets through, as the current location quickly becomes stale as the user keeps moving. The protocol for information queries is as follows: the client sends an InformationRequest object to the client manager, and the client manager sends back an acknowledgment; if the client does not receive the acknowledgment within a specified time interval, it resends the request, at most N times for some fixed N. Each information request has a unique ID, so that if more than one copy gets through, it is processed only once. The client manager processes the request and sends the results to the client's information listener. The protocol for navigation (location) queries is for the client to send a GPS coordinate to the client manager. The GPS coordinates are ordered, so that older coordinates are dropped and not processed. The client manager processes each request and sends the results to the client's directions listener.

Client manager/proxy

The interface to the GIS database is via the Java SDE API. The API allows us to make spatial queries on the feature geometries stored in the database and standard SQL queries on the attributes associated with spatial features. Figure 3-10 shows the architecture of the client manager. The Client Listener accepts two kinds of requests: information requests and navigation-prompt requests. There are many types of information requests; as they arrive, they are put


Figure 3-10. Client manager architecture

into a queue that is monitored by the Information Request Manager (IRM). The IRM is a thread that continually attempts to take requests from its queue: if there are requests in the queue, the first one in is removed and processed, and the result is placed into a queue monitored in a similar fashion by a thread called the Information Sender. If the IRM finds no requests in its queue, it waits until one arrives; when a new request arrives, the queue notifies the IRM, which then proceeds. This is done using Java's wait and notify object monitors.
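The blocking behavior just described is the classic producer-consumer pattern built directly on Java's wait and notify monitors. A minimal sketch of such a queue (the class name is illustrative; the prototype's own queue implementation may differ):

```java
import java.util.LinkedList;

// Minimal sketch of the FIFO queue pattern between the Client Listener
// and the IRM: put() adds a request and notifies any waiting thread;
// take() blocks in wait() until a request is available.
class RequestQueue<T> {
    private final LinkedList<T> items = new LinkedList<>();

    synchronized void put(T item) {
        items.addLast(item);
        notifyAll(); // wake any thread blocked in take()
    }

    synchronized T take() {
        while (items.isEmpty()) {
            try {
                wait(); // releases the lock until put() notifies
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return null; // interrupted while waiting
            }
        }
        return items.removeFirst();
    }
}
```

The `while` loop (rather than an `if`) rechecks the condition after every wakeup, which is the standard guard against spurious wakeups and races between multiple consumers.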


Navigation-prompt requests arrive in the form of GPS coordinates. When GPS coordinates arrive, the Client Listener places them in the Navigation Manager's (NM) queue. The NM is similar to the IRM in that it is also a thread waiting for GPS coordinates to arrive in its queue. The NM holds an instance of a Route object that contains the route the client is currently on. Through method invocation, the NM asks the Route for prompt information, which describes the user's relation to the route; the prompt is then put into the Directions Sender's queue to be sent back to the client. A few requests cannot be satisfied using the SDE API: it cannot generate routes, and it cannot give us the user's relation to the route other than the distance from it. Our database contains a layer representing the network of campus sidewalk centerlines. ESRI, the company that developed SDE, has a desktop GIS product called ArcView. It has a component called Network Analyst that can generate least-cost routes through a network, and a scripting language called Avenue with which it can be customized. ArcView can also be run as a Remote Procedure Call server; however, it will only accept an ASCII string as an input argument and only return an ASCII string to the RPC client that invoked it. The following explains how we use ArcView to generate routes. First we start an RPC server process. When a route request arrives, it is passed to the IRM, which calls an SDE C program, sde2shp, that converts the sidewalk centerline layer in the database to a file format readable by ArcView, called a shapefile. This is done using the Java Native Interface (JNI). Now we have our network of centerlines in a format that ArcView can work with. Next we need


to call the ArcView RPC server. Since Java does not do RPC, we wrote an RPC client in C and used JNI to write a Java class that communicates with it. The RPC server accepts only an ASCII string, which contains the name of an Avenue script to run along with the name and path of the file containing the sidewalk network; it also contains a start point and an end point for the route. ArcView then executes the script, which generates a least-cost path from the start point to the end point. Since ArcView can only return a string to our RPC client, we generated a sequence of IDs for each sidewalk segment in the database. ArcView returns a string of sidewalk IDs to our RPC client, or nil if no route exists. If nil is returned, the IRM returns "no route available" to the client. If IDs are returned, the IRM selects those sidewalk segments from the database and builds a navigable route. This route is then returned to the client and also passed to the NM. In Figure 3-10, the NM is drawn with dotted lines to signify that it might not reside in the Client Manager: we designed the system so that the NM can exist either on the server side, where computing resources are greater, or on the client itself, where access to it is much faster.

Client

The client has many components, as can be seen in Figure 3-11. I will now discuss the various component interactions and information flow.

Client control

This is the main unit of the client. All the threads are started here, and references to each thread are maintained here. The Information Listener also passes routes to the Navigation Manager through this component.
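Building a navigable route from the sidewalk segments selected by ID requires chaining them end to end, since the database returns the geometries without any guaranteed order (the "unification" step discussed later for DUPath). A sketch of that chaining, under stated assumptions: endpoints are matched exactly rather than within a tolerance, and the first segment in the list is assumed to be the correctly oriented start of the route.

```java
import java.util.*;

// Sketch of unifying unordered route segments into one connected point
// sequence: repeatedly append the segment that shares an endpoint with the
// current end of the path, flipping reversed segments. Assumes exact
// endpoint matches and that segs.get(0) is the oriented start segment.
class Unifier {
    // Each segment is {x1, y1, x2, y2}.
    static List<double[]> orderedPoints(List<double[]> segs) {
        LinkedList<double[]> pool = new LinkedList<>(segs);
        double[] first = pool.removeFirst();
        LinkedList<double[]> pts = new LinkedList<>();
        pts.add(new double[]{first[0], first[1]});
        pts.add(new double[]{first[2], first[3]});
        while (!pool.isEmpty()) {
            double[] end = pts.getLast();
            boolean attached = false;
            for (Iterator<double[]> it = pool.iterator(); it.hasNext(); ) {
                double[] s = it.next();
                if (s[0] == end[0] && s[1] == end[1]) {        // forward orientation
                    pts.add(new double[]{s[2], s[3]});
                    it.remove(); attached = true; break;
                } else if (s[2] == end[0] && s[3] == end[1]) { // reversed: flip it
                    pts.add(new double[]{s[0], s[1]});
                    it.remove(); attached = true; break;
                }
            }
            if (!attached) throw new IllegalStateException("segments do not form one path");
        }
        return pts;
    }
}
```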


Figure 3-11. Client components

Sender

The Sender waits for objects to arrive in its queue; as they do, the Sender removes them, serializes them into a datagram packet, and sends them to the Client Manager. If the sent object is an information-request object, the Sender waits for an acknowledgment from the Client Manager; if none is received, it resends the request. The Sender will attempt to send the request up to N times for any fixed N we choose. If the sent object is a current-location coordinate, no acknowledgment is required, as the current location is continuously changing and is updated frequently.
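The Sender's retry policy can be sketched with the transport abstracted behind an interface, so the send/await-ack/resend loop is visible without real datagram sockets. This is an illustrative sketch, not the prototype's Sender; the interface and method names are assumptions.

```java
// Sketch of the Sender's retry policy for information requests: send,
// wait for an acknowledgment, and resend up to N times. The Transport
// interface stands in for the datagram send plus timed ack wait; all
// names here are illustrative.
class RetrySender {
    interface Transport {
        // Send the request and wait for an ack; true if the ack arrived
        // within the timeout.
        boolean sendAndAwaitAck(Object request);
    }

    // Returns true if the request was acknowledged within maxAttempts sends.
    static boolean send(Transport t, Object request, int maxAttempts) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (t.sendAndAwaitAck(request)) return true; // acknowledged
            // no ack before the timeout: fall through and resend
        }
        return false; // give up after N attempts
    }
}
```

Paired with the unique request IDs described earlier, a duplicate delivery caused by a resend is processed only once at the manager.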


Information listener

All information requested by the client, other than navigation prompts, arrives on this channel, a datagram socket. Information that will eventually be spoken to the user is passed to the Vocal View via a method call. Grammar information also arrives on this channel and is passed to the Grammar Manager, again by method call. Another important piece of information sent to the Information Listener is the Route. It is passed to the Path Viewer for on-screen viewing, to the Client Control for later verbal review, and, if the Navigation Manager is on the client side, to the Navigation Manager as well.

Directions listener

The Directions Listener waits for navigational prompts to arrive either on a datagram socket (server navigation mode) or in a queue (client navigation mode). Navigational prompts are passed to the Vocal View to be spoken to the user.

DGPS listener

The DGPS Listener is a thread that implements and is registered as a serial-port event listener, which is part of the Java COMM API. The differential GPS receiver is connected to serial port 2 on the wearable and is configured to output an NMEA 0183 protocol message every second. The NMEA message output is the GGA sentence, an ASCII string containing global positioning fix data. The format of the message is:

$GPGGA,hhmmss,xxxx.xx,a,yyyyy.yy,a,x,xx,x.x,x.x,M,x.x,M,x.x,xxxx*hh

The meaning of each field in the $GPGGA message is given in Table 3-2.


Table 3-2. GGA sentence field definitions

hhmmss      The UTC of the position, in hours, minutes, and seconds
xxxx.xx,a   The latitude, North/South
yyyyy.yy,a  The longitude, East/West
x           A GPS quality indicator: 0 = fix not available or invalid; 1 = GPS fix; 2 = differential GPS fix
xx          The number of satellites in use, from 00 to 12
x.x         The horizontal dilution of precision
x.x         The antenna altitude above or below MSL
M           The units of antenna altitude, in meters
x.x         The geoidal separation, the difference between the WGS-84 earth ellipsoid and MSL
M           The units of geoidal separation, in meters
x.x         The age of the differential GPS data, in seconds
xxxx        The differential reference station ID, from 0000 to 1023

An example of a GGA sentence is:

$GPGGA,120757,5152.985,N,00205.733,W,1,06,2.5,121.9,M,49.4,M,,*52

We are interested in the latitude, longitude, and fix quality. The DGPS Listener is notified of serial events; each data-available event passes a byte stream. The byte stream is parsed, and once the listener has received a (latitude, longitude, fix quality) tuple, it creates a new current-location object and sends it to the Navigation Manager: via the Sender if the Navigation Manager is on the server side, or via a queue if the Navigation Manager is local to the client. Each GPS coordinate is displayed in the Path Viewer. However, since the GPS coordinates are updated every second, we cannot provide a spoken navigation prompt for each one. The DGPS Listener therefore has a method that can be called to mark the next location fix to be spoken to the user.
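Extracting the three fields we use from a GGA sentence can be sketched as follows. NMEA encodes angles as (d)ddmm.mmm (degrees followed by decimal minutes), so the minutes must be divided by 60, and south latitudes and west longitudes are made negative. This is a hedged sketch of the parsing, not the prototype's DGPS Listener; checksum verification is omitted.

```java
// Sketch of parsing the GGA sentence for the fields Drishti uses:
// latitude, longitude, and fix quality. NMEA encodes angles as
// (d)ddmm.mmm, so decimal minutes are converted to decimal degrees;
// S and W hemispheres become negative. Checksum checking is omitted.
class GgaParser {
    final double latitude;   // decimal degrees, negative = south
    final double longitude;  // decimal degrees, negative = west
    final int fixQuality;    // 0 = invalid, 1 = GPS, 2 = differential GPS

    GgaParser(String sentence) {
        String[] f = sentence.split(",");
        if (!f[0].equals("$GPGGA")) throw new IllegalArgumentException("not a GGA sentence");
        latitude = toDegrees(f[2], 2) * (f[3].equals("S") ? -1 : 1);
        longitude = toDegrees(f[4], 3) * (f[5].equals("W") ? -1 : 1);
        fixQuality = Integer.parseInt(f[6]);
    }

    // ddmm.mmm (degDigits = 2 for latitude, 3 for longitude) -> decimal degrees
    private static double toDegrees(String v, int degDigits) {
        double deg = Double.parseDouble(v.substring(0, degDigits));
        double min = Double.parseDouble(v.substring(degDigits));
        return deg + min / 60.0;
    }
}
```

Applied to the example sentence above, this yields roughly 51.883 degrees north, 2.096 degrees west, with fix quality 1 (an uncorrected GPS fix).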


A navigation prompt is then generated and returned to the Vocal View. To invoke a navigation prompt, one says "location" to Drishti.

Vocal listener

The Vocal Listener extends the class ResultAdapter, which is part of the javax.speech.recognition package; javax.speech is IBM's implementation of the Java Speech API. The Vocal Listener overrides the resultAccepted method to receive recognition matches. Recognition proceeds as follows: sound waves are captured by the microphone and converted into electrical impulses; the sound card converts the acoustic signal into a digital signal; the recognition engine converts the digital signal into phonemes and then matches them to words in Drishti's rule grammar. Below is one of Drishti's rule grammars, called browse.

grammar browse;

public <stop> = stop {stop} | quit {stop} | done {stop} | finished {stop};

public <move> = next {next} | forward {next} | previous {back} | more {more} | first {top} | top {top} | last {bottom} | bottom {bottom};

The browse grammar is active when the user is browsing a list, such as the places to go or a route itinerary. If the user says "done," the rule is matched and the recognizer generates a resultAccepted event, which is passed to the Vocal Listener. From this event it can get the rule that was matched and also the tag that was matched, {stop}. This indicates to the Vocal Listener that the user would like to stop browsing the list. The browse grammar would then be deactivated and the general Drishti grammar


would be activated. This feature of dynamically changing grammars allows the recognizer to constrain which rules to match, enhancing the accuracy of recognition. For another example from the browse grammar: if the user says "more," the rule is matched with the tag {more}, which tells Drishti to speak the next three items in the list; if the user says "forward," the same rule is matched but with the tag {next}, and Drishti speaks only the next item. Another feature of the JSAPI that the Vocal Listener employs is loading grammars dynamically. For example, when a user wants a route, he says "route"; Drishti then prompts "from" and listens. So that Drishti can understand place names on campus, which can change from time to time, it downloads the place names from the GIS database and writes them in the Java Speech Grammar Format (JSGF) to a StringBuffer. The StringBuffer can then be loaded as a grammar and activated. Now Drishti will understand "Weil."

Grammar manager

The Grammar Manager receives information from the database to keep grammars up to date and to create new grammars.

Vocal view

The Vocal View has three methods: say(String), sayJSML(JSMLString), and ClearSpeech(). JSML stands for Java Speech Markup Language; it improves the quality and naturalness of the synthesized voice. These methods simply delegate to similar methods in the Synthesizer. For example, VocalView.say("happy trails") causes Drishti to say "happy trails." However, VocalView.say("23.5") causes Drishti to say "dot 5," so to speak numbers correctly we use


VocalView.sayJSML("23.5"), which causes Drishti to say "point 5." When we call one of the say methods, the synthesizer places the string argument in a FIFO queue to be spoken. Sometimes Drishti becomes overwhelmed with information to be spoken, falls behind, and continues to speak stale information. In such cases the user can say "stifle," and the ClearSpeech() method is called to empty the speech queue.

Path Viewer

The Path Viewer is built using Java2D classes and allows us to view routes, the current location, and surrounding features.

Navigation

During navigation, the following information must be available: one's distance from the path (in our case, the centerline of the sidewalk); one's relationship to the path (is it to our left or right, or are we on it); and how far it is until the next turn, how much to turn, and in what direction. As discussed earlier, ArcSDE does not store the topology of feature geometries and thus cannot generate routes. We overcame this problem by customizing ArcView's Network Analyst as a route server running under RPC. Here again we have requirements that cannot be satisfied using the ArcSDE Java API: with the API we can get our distance from a path, but we cannot determine our relationship to it (is it to our left or right, how far until a turn, which way to turn, and by how much). So we used Java2D to build our own Route object, called DUPath, for Drishti Unified Path. It is "unified" because it is a series of connected coordinates. Also, when we


extracted the route from the database, it came out in segments that were not in order, and we had to put them together in order, that is, unify them. Figure 3-12 shows a DUPath, with the user's current location and the perpendicular distance d from the centerline.

Figure 3-12. Drishti unified path

A DUPath contains the following information: a sequence of (x, y) coordinates, a sequence of turn angles, the length of each segment, the slope of each segment, the real-world width of each segment, and the database feature ID (or primary key) of each segment. By retaining the FID, we do not lose the link from a segment to any other stored information. With the width of each segment, we can let the user know whether they are on the path. The turn angle at each turn is precalculated to cut down on the processing that must be done during real-time navigation; the turn angles are also used, along with the length of each line segment, to build an itinerary. The slope of each line segment (in the horizontal x, y plane) is used in calculating the user's perpendicular distance from the


center of the path. All the relevant slopes for the individual segments are precalculated and stored, reducing the number of instructions that must be executed during real-time navigation.
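The per-segment geometry described here can be computed with a 2D cross product: its magnitude gives the perpendicular distance from the centerline, and its sign gives the side of the path the user is on. A minimal sketch, not the prototype's DUPath code; the sign convention (positive cross product means left of the direction of travel) and the half-width on-path test are assumptions for illustration.

```java
// Sketch of per-segment navigation geometry: the user's perpendicular
// distance from the centerline and which side of it they are on, via a
// 2D cross product. Positive cross product = left of the direction of
// travel (an assumed convention).
class SegmentGeometry {
    // Perpendicular distance from point (px, py) to the line through
    // (x1, y1) -> (x2, y2).
    static double distance(double x1, double y1, double x2, double y2,
                           double px, double py) {
        double cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1);
        return Math.abs(cross) / Math.hypot(x2 - x1, y2 - y1);
    }

    // +1 = left of the direction of travel, -1 = right, 0 = on the line.
    static int side(double x1, double y1, double x2, double y2,
                    double px, double py) {
        double cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1);
        return (int) Math.signum(cross);
    }

    // On the walkway if within half its real-world width of the centerline.
    static boolean onWalkway(double dist, double walkwayWidth) {
        return dist <= walkwayWidth / 2.0;
    }
}
```

Because the segment slopes and lengths are stored in the DUPath, both quantities reduce to a handful of multiplications per fix, which is the point of the precalculation described above.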


CHAPTER 4
ASSESSMENT

Challenges

As with any system-integration project, there were many great rivers to cross to arrive at a working prototype. This section reviews some of the main challenges encountered and how we met them.

GIS

Data collection is a time-intensive task. Finding the right datasets to use took a few months. After the datasets were obtained, they had to be ground-truthed for correctness and accuracy: did the data match what was actually on campus, did a sidewalk still exist, and if it did, were its coordinates in space correct? Ground truthing took a few more months; here is a niche where Drishti itself can come in handy, by giving us access to the model of a space while we are immersed in the actual space. The GIS layers we obtained had the outlines of the sidewalks on campus but lacked topology. Since there were discontinuities in the sidewalk features, we had to digitize the centerlines of sidewalks covering a subset of the campus. For the rich navigation queries we wanted to make, our GIS layers needed topology, so we developed our own set of spatial objects to use for navigation. On the upside, they depend only on Java2D, so they can be used on the wearable without installing a full GIS software package. We also had to project our DGPS geographic coordinates into our spatial database's state plane coordinate system (SPCS 83).


COTS

In using many different systems we ended up with a Tower of Babel. We stayed with Java when possible, but also wrote code in C and in ArcView's object-oriented scripting language, Avenue. We interfaced the C code with the Java code using JNI; the Avenue code was interfaced using RPC. The other interface we had to deal with was reading the National Marine Electronics Association (NMEA) 0183 protocol from a serial port, which was done using the Java Communications API package, javax.comm.

Speech Interface

Designing a good command-and-control speech interface is extremely important for a system like Drishti. The goal is to keep the rule grammar constrained to give better recognition accuracy. One should divide the interface into different input states and have a grammar for each state. Otherwise, any stray noise in the background can put the interface into an unknown state, and the user will not know what to say. It is a good idea to have one universal command, such as "escape," that puts the interface into a known state. It is also a good idea to include a command to make the interface stop speaking, in case of information overrun; Drishti has a command, "stifle," to stop all queued speech.

Issues

There are a few issues with the wearable. When the headset is plugged into the audio out on the head-mounted display, the earphone volume is very low; the audio is fine when the earphone is plugged into the audio out on the port replicator. Intense wireless data transfer using the 802.11b PCMCIA card steps on the voice stream: Drishti's voice is degraded with a garble when GPS coordinates are sent over the wireless network.
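The state-per-grammar design recommended above, including the universal "escape" command, can be sketched as a small state machine that rejects any utterance outside the active state's command set. This is an illustrative model of the design advice, not Drishti's actual grammar management; the state and command names are hypothetical.

```java
import java.util.*;

// Sketch of a state-constrained speech interface: each input state
// exposes only its own commands, plus a universal "escape" command that
// is valid in every state and returns the interface to the known "main"
// state. State and command names are illustrative.
class SpeechStates {
    private final Map<String, Set<String>> commands = new HashMap<>();
    private String state = "main";

    void defineState(String name, String... cmds) {
        Set<String> s = new HashSet<>(Arrays.asList(cmds));
        s.add("escape"); // universal command, active in every state
        commands.put(name, s);
    }

    // True if the utterance is in the active state's grammar. "escape"
    // always matches and resets the interface to the known main state;
    // any other match moves to nextState.
    boolean recognize(String utterance, String nextState) {
        if (!commands.get(state).contains(utterance)) return false; // constrained: reject
        state = utterance.equals("escape") ? "main" : nextState;
        return true;
    }

    String state() { return state; }
}
```

Constraining each state's vocabulary this way is what keeps recognition accuracy high, and the ever-active escape rule guarantees there is always a way back to a known state.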


Navigation works better with a client-side cache. The time it took for a GPS coordinate to reach the server, be converted into a navigation prompt, and return to the client was too slow for a normal pedestrian pace. However, the coordinate information is still sent to the client's proxy monitor on the server side for tracking.

Assessment Summary

Drishti is a working model of a navigation system for the visually impaired that has met its primary goals of wearable computing, location awareness, a speech interface, and wireless access to a centralized spatial database in a dynamic environment. Table 4-1 shows a comparison between Drishti and related work in the area of blind navigation systems.

Table 4-1. Drishti functionality compared to related work

Year  Group     Location  Speech  Wearable  Wireless  Dynamic Environments
1991  Golledge
1996  Petrie    Yes       Yes
1998  Ram                         Yes
1998  Golledge  Yes       Yes     Yes
2002  Drishti   Yes       Yes     Yes       Yes       Yes


CHAPTER 5
CONCLUSIONS AND RECOMMENDATIONS

Conclusions

We have developed Drishti (meaning "vision" in the ancient Indian language Sanskrit), a wireless pedestrian navigation system for the visually impaired and disabled. Most systems developed so far lack the kind of dynamic interaction and adaptability to change that our system provides to the user. Our approach also emphasizes contextual awareness, which we believe is very important to enhancing the navigational experience, especially for the blind user.

Recommendations

Drishti is a prototype that brings together many technologies to augment users with spatial and contextual awareness. However, much of the effort went into integrating these technologies, so the individual components were not fully optimized. Future work should focus on the ergonomic design of individual components and on raising them to a higher grade of performance. Currently our GIS database has building plans registered with the rest of the layers, which facilitates smooth outdoor/indoor navigational handoffs. Further, these building plans contain extensive information based on strict building codes, such as fire exits, the seating arrangement in a classroom, elevator locations, and stairs with the number of steps; some form of relative indoor positioning technique needs to be developed to exploit them. GPS is not a foolproof system: we lose track of signals near tall buildings and under tree canopies. At present we compensate for such losses with dead-reckoning techniques using a


magnetic compass, the user's average travel speed, and rules specified in the GIS database. Using a Kalman filter could enhance the navigation component during the dead-reckoning phase. In the future, we plan to incorporate combined GPS and GLONASS (GLObal NAvigation Satellite System, the Russian equivalent of GPS) receivers to take advantage of expanded satellite coverage; studies in urban environments have shown very good position-fix density when GLONASS is incorporated [Walsh et al. 1998]. The client/proxy communication should be split into two channels: one for location information, which is continually updated and can afford to be lost, and the other for information that is not repeatedly sent. Implementing such an approach would save the time it takes to check each incoming serialized object to see whether it requires acknowledgment. Designing the client/proxy as a discoverable service should be explored, as should using Drishti itself as a tool for adding more information to the spatial database to enhance contextual awareness. The boot-up process of the wearable is still not vision-free, so this needs to be worked on. Better caching strategies should be implemented so that Drishti can access more information while disconnected.


LIST OF REFERENCES

Aoki, H., B. Schiele and A. Pentland, “Realtime Personal Positioning System for Wearable Computers,” The Third International Symposium on Wearable Computers, San Francisco, California, October 18-19, 1999, pp. 37-43.

Behringer, R., “Registration for Outdoor Augmented Reality Applications Using Computer Vision Techniques and Hybrid Sensors,” in Proceedings of Virtual Reality, Houston, Texas, March 13-17, 1999, pp. 244-251.

Behringer, R., C. Tam, J. McGee, S. Sundareswaran and M. Vassiliou, “A Wearable Augmented Reality Testbed for Navigation and Control, Built Solely with Commercial-Off-The-Shelf (COTS) Hardware,” in Proceedings of the IEEE and ACM International Symposium on Augmented Reality, Munich, Germany, October 5-6, 2000, pp. 12-19.

Bowskill, J., M. Billinghurst, B. Crabtree, N. Dyer and A. Loffler, “Wearable Location Mediated Telecommunications: A First Step Towards Contextual Communication,” The Third International Symposium on Wearable Computers, San Francisco, California, October 18-19, 1999, pp. 159-166.

Caceres, R. and L. Iftode, “Improving the Performance of Reliable Transport Protocols in Mobile Computing Environments,” in Mobile Computing, T. Imielinski and H. Korth, eds., Kluwer Academic Publishers, Boston, 1996, pp. 207-228.

Chen, T. and R. Shibasaki, “A Versatile AR Type 3D Mobile GIS Based on Image Navigation Technology,” IEEE International Conference on Systems, Man and Cybernetics, Tokyo, Japan, October 12-15, 1999, pp. 1070-1075.

Coors, V., T. Huch and U. Kretschmer, “Matching Buildings: Pose Estimation in an Urban Environment,” in Proceedings of the IEEE and ACM International Symposium on Augmented Reality, Munich, Germany, October 5-6, 2000, pp. 89-92.

Ertan, S., C. Lee, A. Willets, H. Tan and A. Pentland, “A Wearable Haptic Navigation Guidance System,” The Second International Symposium on Wearable Computers, Pittsburgh, Pennsylvania, October 19-20, 1998, pp. 164-165.


Feiner, S., B. MacIntyre, T. Hollerer and A. Webster, “A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment,” The First International Symposium on Wearable Computers, Boston, Massachusetts, October 13-14, 1997, pp. 74-81.

Golding, A. and N. Lesh, “Indoor Navigation Using a Diverse Set of Cheap, Wearable Sensors,” The Third International Symposium on Wearable Computers, San Francisco, California, October 18-19, 1999, pp. 29-36.

Golledge, R., R. Klatzky, J. Loomis, J. Spiegle and J. Tietz, “A Geographical Information System for a GPS Based Personal Guidance System,” International Journal of Geographical Information Science, 1998, 12(7): pp. 727-749.

Golledge, R., J. Loomis, R. Klatzky, A. Flury and X. Yang, “Designing a Personal Guidance System to Aid Navigation Without Sight: Progress on the GIS Component,” International Journal of Geographical Information Systems, 1991, 5(4): pp. 373-395.

Long, S., D. Aust, G. Abowd and C. Atkeson, “Cyberguide: Prototyping Context-Aware Mobile Applications,” in Proceedings of the Conference on Human Factors in Computing Systems, Vancouver, British Columbia, April 13-18, 1996, pp. 293-294.

Mann, S. and H. Niedzviecki, Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer, Doubleday Canada, 2001, 304p.

Petrie, H., V. Johnson, T. Strothotte, A. Raab, S. Fritz and R. Michel, “MOBIC: Designing a Travel Aid for Blind and Elderly People,” Journal of Navigation, Royal Institute of Navigation, London, 1996, 49(1): pp. 45-52.

Ram, S. and J. Sharf, “The People Sensor: A Mobility Aid for the Visually Impaired,” The Second International Symposium on Wearable Computers, Pittsburgh, Pennsylvania, October 19-20, 1998, pp. 166-167.

Randell, C. and H. Muller, “Context Awareness by Analyzing Accelerometer Data,” The Fourth International Symposium on Wearable Computers, Atlanta, Georgia, October 18-21, 2000, pp. 175-176.

Smailagic, A. and R. Martin, “Metronaut: A Wearable Computer with Sensing and Global Communication Capabilities,” The First International Symposium on Wearable Computers, Boston, Massachusetts, October 13-14, 1997, pp. 116-122.


Sutherland, I., “A Head-Mounted Three Dimensional Display,” in Proceedings of the Fall Joint Computer Conference, Thompson Books, Washington, DC, 1968, pp. 757-764.

Thomas, B., V. Demczuk, W. Piekarski, D. Hepworth and B. Gunther, “A Wearable Computer System with Augmented Reality to Support Terrestrial Navigation,” The Second International Symposium on Wearable Computers, Pittsburgh, Pennsylvania, October 19-20, 1998, pp. 168-171.

Van Sickle, J., GPS for Land Surveyors, Ann Arbor Press, Chelsea, Michigan, 1996, 209p.

Walsh, D., S. Capaccio, D. Lowe, P. Daly, P. Shardlow and G. Johnston, “Real Time Differential GPS and GLONASS Vehicle Positioning in Urban Areas,” IEEE Conference on Intelligent Transportation System, Boston, Massachusetts, 1997, pp. 514-519.

Worboys, M., GIS: A Computing Perspective, Taylor & Francis, London, 1995, 376p.

Yang, J., W. Yang, M. Denecke and A. Waibel, “Smart Sight: A Tourist Assistant System,” The Third International Symposium on Wearable Computers, San Francisco, California, October 18-19, 1999, pp. 73-78.

Zeiler, M., Modeling Our World: The ESRI Guide to Geodatabase Design, ESRI Press, Redlands, California, 1999, 199p.


BIOGRAPHICAL SKETCH

Steve Moore was born on May 27, 1963, in Gainesville, Florida, USA. He attended P.K. Yonge, the University of Florida’s developmental research school, from kindergarten through high school. After graduating from high school, Steve enrolled in the University of Florida, where he studied architecture at first but then went on to study mathematics. To support his habit of taking classes at UF, Steve worked at Leonardo’s Pizza and as a Fortran consultant in the Department of Industrial and Systems Engineering. After earning a BA in mathematics from UF in 1987, he married Alicia Conrad (Bachelor of Science in Nursing, UF, 1988) and began work on a graduate degree in mathematics while also taking courses in computer science. He took time off from school to teach math at Santa Fe Community College as an adjunct instructor while also doing a lot of canoeing, camping, and biking with his family. In 1998 he returned to the University of Florida to pursue a master’s degree in computer science, and he received his Master of Science degree in August 2002.

His personal interests include his family, biking, canoeing, camping, and soccer, to name a few. His current research interests include mobile computing, location-aware computing, and pervasive computing.