
Drishti, Integrated Indoor/Outdoor Navigation System and Service



DRISHTI: AN INTEGRATED INDOOR/OUTDOOR NAVIGATION SYSTEM AND SERVICE

By

YINGCHUN (LISA) RAN

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2003

Copyright 2003 by Yingchun (Lisa) Ran


To my dear parents for their encouragement. To my dear husband Jeffery for all of his deep love, support and consideration. To my lovely baby son Samuel, who has made the great sacrifice of letting his mother stay in America to finish this research. His smile has made this thesis possible.


ACKNOWLEDGMENTS

I would like to sincerely thank Dr. Abdelsalam Helal for his great work, guidance and encouragement. I would like to express my deep thanks to Steve Moore for his wonderful work on the Drishti outdoor version and his generous help with the spatial database, code debugging and voice communication. I also thank Bryon Winkler for kindly answering my numerous questions about the indoor location system.

TABLE OF CONTENTS

ACKNOWLEDGMENTS

LIST OF FIGURES

ABSTRACT

CHAPTERS

1 INTRODUCTION

2 REVIEW OF RELATED TECHNOLOGIES
  2.1 Geographic Information System (GIS)
    2.1.1 Spatial Data
    2.1.2 Spatial Data Models
    2.1.3 Attribute Data
    2.1.4 ArcView
  2.2 Global Positioning System (GPS)
  2.3 ArcSDE
  2.4 ArcSDE Java API
  2.5 Hexamite Ultrasound Local Positioning System
  2.6 Voice Recognition and Speech Synthesis
  2.7 OSGI
  2.8 Wearable Computing
  2.9 Wireless Communication

3 OVERVIEW OF THE DRISHTI OUTDOOR NAVIGATION VERSION
  3.1 Review of Related Work
    3.1.1 Detecting Obstacles and Hazards
    3.1.2 Location and Orientation
  3.2 Outdoor Version of Drishti Navigation System
    3.2.1 System Design
    3.2.2 COTS Hardware and Software

4 THE INTEGRATED INDOOR/OUTDOOR DRISHTI
  4.1 System Architecture
  4.2 Interactions of Components
    4.2.1 Client
    4.2.2 ClientServer Proxy

5 LOCATION SERVER
  5.1 Hexamite Location System
    5.1.1 Hardware Components
    5.1.2 Hardware Configuration
    5.1.3 Distance String
  5.2 OSGI Location Service
    5.2.1 OSGI
    5.2.2 Location Server with Indoor Location Service Bundle

6 SUMMARY AND FUTURE WORK
  6.1 Achievement and Contribution
  6.2 Future Work

REFERENCES

BIOGRAPHICAL SKETCH

LIST OF FIGURES

2-1 An Example of Layers in GIS
2-2 Different Layers in One View in ArcView
2-3 How GPS Works
2-4 ArcSDE Architecture
2-5 Six-Point Hexamite Local Positioning System
3-1 Structure of Navigation System Using Wearable Sensors
3-2 Location Guidance System, Hideo Makino et al.
3-3 Client/Proxy/Server Architecture of Drishti
3-4 Mobile Client Components Interaction
3-5 Sample Voice Prompt of a Route
3-6 User Browses List of Available Destinations and Requests a Route
4-1 Wearable Mobile Client
4-2 Integrated Indoor/Outdoor Client/Proxy/Location Server Architecture
4-3 Client's Three Components: Vocal Interface, DGPS Receiver and Communicator
4-4 Client Manager Architecture
4-5 Work Process of Drishti Indoor
4-6 Example of Geometric Calculation of Orientation
5-1 Hexamite Location System Devices
5-2 Hexamite Ultrasound Location System Coverage of the Smart House
5-3 Settings.txt for Hardware Configuration
5-4 Location Calculation by Trilateration
5-5 Location Calculation Example
5-6 Orientation and Position Analysis: P.left.X > P.right.X and P.left.Y > P.right.Y
5-7 OSGI Architecture
5-8 Bundle State Diagram
5-9 Location Service Scheme

Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

DRISHTI: AN INTEGRATED INDOOR/OUTDOOR NAVIGATION SYSTEM AND SERVICE

By

Yingchun (Lisa) Ran

May 2003

Chair: Abdelsalam (Sumi) Helal
Major Department: Computer and Information Science and Engineering

Drishti is an integrated indoor/outdoor navigation system for visually impaired people. It uses a precise position measurement system, a wireless connection, a wearable computer, and a vocal communication interface to guide users and help them travel independently and safely. Outdoors, Drishti uses DGPS as its location system to keep the user as close as possible to the center line of sidewalks, and it provides the user with an optimal route by means of its dynamic routing and rerouting ability. The user can switch the system from an outdoor to an indoor environment with a simple vocal command. An ultrasound location system called "Hexamite" provides very precise indoor location measurements. The user can ask for information about the room's layout and the position of any furnishings. The user's location is compared against the spatial database of the "smart house," and the relationship between the user and the indoor facilities is computed. Drishti then gives the user travel prompts about possible obstacles to help him or her avoid injury. Drishti also provides the user with step-by-step walking guidance. The indoor service of Drishti is bundled under the OSGI framework to make it compatible with other services offered by smart houses, such as opening the door for a visitor or checking the weather by phone.

CHAPTER 1
INTRODUCTION

Statistics [1] indicate that there are approximately 10 to 11 million blind or visually impaired people in North America, and this number is growing at an alarming rate. Many of these people have difficulty knowing where they are or where they are going, and frequently feel disoriented or even isolated, so navigational guidance is very important for them. Navigation involves updating one's position and orientation while traveling an intended route and, if the person becomes lost, reorienting and reestablishing a route to the destination. Guiding people means giving them additional information, which usually includes obstacle prompting. Visually impaired people not only have a very limited reachable world but also depend on repetitive, predefined routes with a minimum of obstacles. At times these routes may be subject to change: a sidewalk may be blocked for roadwork, or a fallen branch or a temporary puddle after a heavy rain may be dangerous to someone who cannot see it. A guide dog or long cane may help detect the problem, but blind people need more information to find detours or rearrange their routes.

This thesis builds on the outdoor version of the Drishti navigation system by Steve Edwin Moore [2]. The outdoor version of Drishti uses DGPS to locate the user in an outdoor environment, answers the user's various requests, and gives routing and rerouting information dynamically according to changes in the environment. In this thesis we extend the Drishti outdoor version to support indoor navigation. In an indoor environment, traveling is even more difficult because the space is relatively small and there are many narrow hallways, stairs, doors and pieces of furniture, so visually impaired people face closer obstacles and may very well stumble over them. If they are new to the environment, it is very dangerous for them to walk alone.

This system tells the user the layout of the indoor facility and gives him or her a big picture of what the environment is like. The user can also get distance and navigation information between destinations, and on the way he or she can ask for obstacle prompts to guarantee travel safety. The system can also communicate with the user and answer different requests. Because GPS is not available indoors, and because the requirements on measurement error change, Drishti switches to a different positioning service called Hexamite for indoor use and prompts the user with the indoor room layout, using the smart house as its example. Since indoor space is smaller and more crowded than the outdoors, a higher-precision measurement scale is provided.

CHAPTER 2
REVIEW OF RELATED TECHNOLOGIES

Many mature, commercial technologies are used in this research to provide comprehensive (indoor/outdoor) navigational guidance. In the following, I briefly describe each of these technologies.

2.1 Geographic Information System (GIS)

GIS is a complex computer system that incorporates technologies from a wide range of disciplines, including, but not limited to, remote sensing, cartography, surveying, geodesy, photogrammetry, geography and computing. It combines layers of information about a place to provide a better understanding of it. With GIS, you may combine many layers of information according to your own purpose. The real power of a GIS is its ability to integrate various data layers and perform analyses across them. Figure 2-1 is an example of the data layers used to describe a piece of land, including information about hydrology, soils, roads, elevation, land use, etc., with each piece of information being one layer.

2.1.1 Spatial Data

Once the layers are compiled, we can analyze the information they represent. The information on the layer map is spatial data. Spatial data contain the coordinates and identifying information for various map features. There are mainly three kinds of features: points, lines and polygons (areas). Buildings can be represented as polygons; roads, railways and rivers are all lines; and wells and cities in a marketing layer may be considered points.

Figure 2-1 An Example of Layers in GIS

2.1.2 Spatial Data Models

There are two kinds of spatial data models: raster and vector. The raster format uses an array of grid cells, or pixels. Each grid cell is referenced by a row and column number and contains a number representing the type or value of the attribute being mapped. Each grid cell represents an area on the surface of the earth and the average value of whatever attribute is being considered for that particular place. Real-world features are assumed to be present or absent from any given square; the smaller the square, the more accurate the representation of the real-world feature. There are no points, lines or polygons.

In vector GIS, we represent real-world features abstractly as mathematical vectors located in a Cartesian (x, y, z) coordinate space. Vector technology uses a series of lines to define the boundary of the object of interest.

In this thesis we use the vector model; points, lines and polygons are vector-based GIS features.

2.1.3 Attribute Data

Attribute data is another type of GIS data that is not on the layer map but that can be associated with the map through links to the spatial data. If a point representing a city in a marketing layer is spatial data, then the amount of Coke sold in that city and the population of that city are attribute data.

2.1.4 ArcView

ArcView is a powerful tool made by the Environmental Systems Research Institute (ESRI) for the management, display, query, and analysis of spatial information. With a knowledge of spatial data, attribute data and GIS layers, we can easily build views, tables, charts, layouts and scripts and wrap them together into a project to represent the relationship between spatial data and attribute data. The basic ArcView concepts follow.

Project: A project is a file in which ArcView stores the user's work. All related work can be wrapped in a single project, including tables, charts, spatial views of the data, map layouts, etc. When that project file is opened again, all its wrapped component parts are ready to use again. Each project has a window.

Project Window: The project window is the smaller window on the left of the initial ArcView window. The initial "untitled" name of the project is changed to the name the user defines, with an .apr extension. It lists all the components of the project in the order of views, tables, charts, layouts and scripts.

People can use this window to add new components to a project or to open existing ones.

View: A view is the interactive map that is used to display, query, and analyze data in ArcView. Several map layers--called Themes--are normally displayed in a single view. You can have more than one view in a project.

Theme: "Theme" is the term used for a map layer in ArcView containing both spatial and attribute data. A Theme is a file containing the graphic information required to draw a set of geographic features, together with information about those features. Themes are listed on the left side of the view window in the Table of Contents, along with the legends that represent them on the map.

Table: A Table is a data file that contains rows of information about items in a particular geographic category such as hotels, cities, streets, counties, countries, etc., with each row representing a different named item. Tables have numerous columns, with each column representing a particular attribute. Tables are the main components of the database stored in ArcView.

Figure 2-2 is an example of creating different layers (boundary, creeks, etc.) in one view. In this work, ArcView is used as a geographic tool to create the indoor navigation database.

2.2 Global Positioning System (GPS)

GPS is a worldwide radio-navigation system formed from a constellation of 24 satellites and their ground stations. The location of a feature on the surface of the earth, and its spatial relationship to other features around it, is often determined with GPS. (The Global Navigation Satellite System (GLONASS) deployed by the Russian Federation has much in common with GPS in terms of satellite constellation, orbits and signal structure.) Figure 2-3 [5] illustrates how a GPS receiver finds its location.

Figure 2-2 Different Layers in One View in ArcView

Figure 2-3 How GPS Works

The GPS receiver uses the geometric principle of trilateration, which allows one to find a location if its distances from other, already-known locations are known. The receiver receives radio signals from four (or more) GPS satellites, calculates the distance to each satellite from the time the signal takes to arrive at the receiver, and then determines its exact location and altitude on the earth.
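To make the principle concrete, the following is a minimal sketch of trilateration in two dimensions with made-up anchor coordinates. It illustrates only the geometry; it is not code from Drishti, and a GPS receiver solves the corresponding 3D problem internally.

    // Minimal 2D trilateration sketch (illustrative only): given three known
    // anchor positions and measured distances to each, solve for the unknown
    // position by linearizing the circle equations.
    public final class Trilateration {
        /** Anchors (x1,y1),(x2,y2),(x3,y3) with measured distances r1,r2,r3. */
        static double[] locate(double x1, double y1, double r1,
                               double x2, double y2, double r2,
                               double x3, double y3, double r3) {
            // Subtracting circle 1 from circles 2 and 3 gives two linear equations:
            // 2(x2-x1)x + 2(y2-y1)y = r1^2 - r2^2 + x2^2 - x1^2 + y2^2 - y1^2
            double a1 = 2 * (x2 - x1), b1 = 2 * (y2 - y1);
            double c1 = r1 * r1 - r2 * r2 + x2 * x2 - x1 * x1 + y2 * y2 - y1 * y1;
            double a2 = 2 * (x3 - x1), b2 = 2 * (y3 - y1);
            double c2 = r1 * r1 - r3 * r3 + x3 * x3 - x1 * x1 + y3 * y3 - y1 * y1;
            double det = a1 * b2 - a2 * b1;      // anchors must not be collinear
            return new double[] { (c1 * b2 - c2 * b1) / det,
                                  (a1 * c2 - a2 * c1) / det };
        }

        public static void main(String[] args) {
            // Hypothetical anchors at (0,0), (10,0), (0,10); true position (3,4).
            double[] p = locate(0, 0, 5, 10, 0, Math.sqrt(65), 0, 10, Math.sqrt(45));
            System.out.printf("x = %.2f, y = %.2f%n", p[0], p[1]); // prints 3.00, 4.00
        }
    }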

In this project, GPS information is used to locate the Drishti user when he or she is outside and GPS information is available.

2.3 ArcSDE

ArcSDE [6] is the GIS gateway that helps manage spatial data in a DBMS and makes the data available to many kinds of applications, providing data and maps across the Internet. ArcSDE allows you to manage spatial data in one of four commercial databases (IBM DB2, Informix, Microsoft SQL Server™, and Oracle) and to serve ESRI's file-based data with ArcSDE for Coverages. ArcSDE provides data to the ArcGIS Desktop products (ArcView, ArcEditor™, and ArcInfo™) and through ArcIMS, and it is a key component in managing a multi-user spatial database. ArcSDE supports spatial and non-spatial queries from clients. It can interact with a relational database management system (RDBMS) server for data storage and retrieval, and it can also perform GIS operations on data. Figure 2-4 shows an example of the ArcSDE architecture.

In this research, we load the database made with ArcView onto ArcSDE and send queries from the Drishti client manager to ArcSDE. ArcSDE works as a gateway between the Drishti user and the Oracle 8 DBMS. The spatial database of the "smart house" can be laid on top of a map of the University of Florida campus, so the user has a global idea of his or her location even when inside a building.

2.4 ArcSDE Java API

com.esri.sde.client is the Java application programming interface used to build ArcSDE database queries. It uses streams to transfer data between an SDE server and a client. Input coordinates are gathered to build shapes, which are compared with shapes fetched from the SDE server. If two shapes overlap within 1 foot, they are considered "close"; if the isContaining operation on two shapes returns true, the first shape is said to be within the second. We use these operations to define the spatial location of the blind person within the "smart house."

Figure 2-4 ArcSDE Architecture

2.5 Hexamite Ultrasound Local Positioning System

The Hexamite ultrasound local positioning system is offered by an OEM company, Hexamite, from Australia [7]. It harnesses ultrasound for high-resolution, highly repeatable positioning; the highest resolution can reach 0.3 mm. The system consists of at least two Hexamite positioning devices, one of which knows its distance to the other. The device that knows the distance to the other is called a pilot, and the other kind of device is called a beacon.

A Hexamite ultrasound local positioning installation may consist of a limitless number of pilots and beacons, forming as large a system as the designer desires. The system is composed of two parts: custom software and location devices, which include HE900M pilots, HE900T beacons and an RS485/RS232 converter.

The nominal speed of sound in air is 344 m/s. Overall attenuation in air is due to geometric spreading, conduction and shear viscosity losses, molecular relaxation, boundaries, refraction by a non-homogeneous atmosphere and diffraction by turbulence. The speed of sound may vary depending on the attenuation of the air. The distance between a pilot and a beacon is calculated by multiplying the speed of sound by half the time the sonic wave takes to travel to the beacon and back.

Figure 2-5 illustrates the six-point system, which consists of four fixed pilots (1, 2, 3, 4) and two moving beacons (5, 6). The nature of the sonic wave sets the operating range and limits; most Hexamite local positioning systems use an ultrasound frequency of about 40 kHz, which limits the operating range to about 20 m per point. Customers can use more devices to increase the monitored space and range.

The Hexamite ultrasound local positioning system is a time-sharing system that requires synchronization, which can be accomplished by connecting the pilots together or by radio, light or sound. There are three ways to synchronize this system: via RS485 serial input, via an I/O pin, or by sound. In this example, pilots 1, 2, 3 and 4 are connected together via RS485. One of the fixed pilots functions as a master that synchronizes beacons 5 and 6 with the built-in sonic synchronization feature. The master initiates the timing, or distance acquisition, cycle of the whole system by sending out synchronization information when the cycle begins. All the pilots in the system transmit their distances to the two beacons one after another over the serial network during the cycle, and the last pilot sends information to the master pilot to trigger the next cycle.
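As a small illustration of the distance computation just described, the following sketch uses the nominal 344 m/s figure from the text; the round-trip time in the example is invented, and this is not Drishti or Hexamite code.

    // Sketch of the pilot-to-beacon distance computation: the speed of sound
    // times half the measured round-trip time of the ultrasonic pulse.
    public final class UltrasoundRange {
        static final double SPEED_OF_SOUND = 344.0; // m/s, nominal value in air

        /** Distance in meters from a measured round-trip time in seconds. */
        static double distance(double roundTripSeconds) {
            return SPEED_OF_SOUND * roundTripSeconds / 2.0;
        }

        public static void main(String[] args) {
            // A 29 ms round trip corresponds to roughly 5 m of separation.
            System.out.printf("%.2f m%n", distance(0.029)); // ~4.99 m
        }
    }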

We adopt this six-point Hexamite local positioning system as the foundation of the indoor navigation system; combined with the client manager, the Drishti server and the smart room database, it makes up a comprehensive guidance system. The details of how the Hexamite local positioning system works are given in chapter 5.

Figure 2-5 Six-Point Hexamite Local Positioning System

2.6 Voice Recognition and Speech Synthesis

Because visually impaired people rely heavily on voice communication, voice recognition and speech synthesis play an important part in this navigation system. Although some researchers use haptic aids in their guidance systems, we think visually impaired people will be more confident while traveling if their hands are free.

The only problem is that the user may be less aware of environmental sounds while concentrating on communication with the system.

Voice or speech recognition is the ability of a machine or program to receive and interpret dictation, or to understand and carry out spoken commands. Using analog-to-digital conversion, the user's voice is captured by the microphone and converted into digital signals on a sound card. For a computer to decipher the signals, it must have a digital database or vocabulary of words (in other words, phonemes) and a speedy means of comparing this data with the signals. A comparator compares these stored patterns with the output of the analog-to-digital converter. The words that the comparator tries to match come from the grammar defined by the system designer.

Speech synthesis is the computer-generated simulation of human speech. It is used to translate written information into aural information. The javax.speech package (javax.speech.synthesis and javax.speech.recognition) defines an abstract software representation of a speech engine that deals with either speech input or speech output. The javax.speech.synthesis package can easily convert plain text to simulated human speech.

2.7 OSGI

OSGI (Open Service Gateway Initiative) is an industry plan for a standard way to connect different devices. Its specification is a Java-based application-layer framework that focuses exclusively on providing an open application layer and gateway interface for services gateways. Users are able to change from one monitoring service to another without having to install a new system of wires and devices or replace any of the networking infrastructure.
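For orientation, the following is a minimal sketch of how a location service could be exposed as an OSGi bundle. The org.osgi.framework types are the standard OSGi API, but LocationService and HexamiteLocationService are hypothetical names chosen for this sketch, not identifiers from the thesis.

    // Hedged sketch: registering an indoor location service with the OSGi
    // framework so other smart-house bundles can discover and use it.
    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceRegistration;

    interface LocationService {
        double[] currentPosition(); // (x, y) in the room coordinate system
    }

    class HexamiteLocationService implements LocationService {
        public double[] currentPosition() {
            return new double[] { 0.0, 0.0 }; // stub; real code queries the pilots
        }
    }

    public class LocationActivator implements BundleActivator {
        private ServiceRegistration registration;

        public void start(BundleContext context) {
            // Publish the service under its interface name so other bundles
            // (door openers, caregiver monitors, ...) can look it up.
            registration = context.registerService(
                    LocationService.class.getName(),
                    new HexamiteLocationService(), null);
        }

        public void stop(BundleContext context) {
            registration.unregister(); // the framework also cleans up on stop
        }
    }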

A smart house uses many devices, such as automatic lamps, radios, doors, caregiver monitoring systems, and alarm systems. Our indoor location system is bundled as a single operation, so the administrator of a smart home can conveniently switch back and forth among different operations, or run different services at the same time to enhance the function of an individual bundle operation.

2.8 Wearable Computing

Wearable computing [8] facilitates a new form of human-computer interaction based on a small body-worn computer system that is always on and always ready and accessible. Wearable computers have five major characteristics:

Portable while operational: The most distinguishing feature of wearable computing is that it can operate while the user is moving.

Hands-free use: This feature, along with portability, is especially important for visually impaired people, whose hands may already be occupied by a long cane or guide dog.

Sensors: A wearable computer can be augmented with different services and sensors such as wireless communications, GPS, ultrasound, infrared, etc. We use GPS for outdoor location and the Hexamite ultrasound system for indoor location.

"Attention-getting": A wearable computer should be able to convey information to its user even when it is not actively being used.

Always on: A wearable computer is always on and working, sensing and acting.

These five distinguishing features account for some of the advantages of this project. In this thesis a Xybernaut wearable computer is used. It can be worn on a belt or integrated into a vest, or it can be carried directly on the body. Combined with a headset or a flat panel, it frees the user to work with his or her hands.

2.9 Wireless Communication

This project uses 802.11b wireless LANs, which provide 11 Mbps of bandwidth.

CHAPTER 3
OVERVIEW OF THE DRISHTI OUTDOOR NAVIGATION VERSION

Blind and visually impaired people are at a disadvantage when they travel because they cannot get enough information about location, orientation, traffic and obstacles on the way, things that can easily be seen by people without visual disabilities. They depend on repeatable, regular routes, and their living environment is limited because of their disability. Before technical aids existed, they had to rely on guide dogs and long canes when traveling. The goal of this navigation system is to allow visually impaired people to travel through familiar and unfamiliar environments independently. Such a system usually consists of three parts: sensing the immediate environment for obstacles and hazards, providing information about location and orientation during travel, and providing optimal routes toward the desired destination.

3.1 Review of Related Work

3.1.1 Detecting Obstacles and Hazards

Guide dogs and long canes are the conventional means of navigation. Many new technologies have been developed to help people travel with a greater degree of psychological comfort and independence.

As early as 1988, Borenstein et al. [9] completed a communication system with ultrasonic sensors for the blind. Their system is composed of three major subsystems: a mobile carriage, a robot mounted on it, and a computerized post next to the disabled person's bed. The robot uses two ultrasonic range finders mounted on the vehicle to detect obstacles and provide information for detouring around them.

Other sensors, such as light-detecting sensors, force sensors, a video camera and a speech recognition unit, are attached to the system to augment the navigation function.

Sunita Ram and Jennie Sharf [10] designed the "People Sensor," which uses pyroelectric and ultrasound sensors to locate and differentiate between animate (human) and inanimate (non-human) obstructions in the detection path. It thus reduces the possibility of embarrassment by helping the user avoid inadvertent cane contact with other pedestrians and objects, or speaking to a person who is no longer within hearing range. The system also measures the distance between the user and obstacles.

John Zelek [11] is working on a technology, "the logical extension of the walking cane," which provides visually impaired individuals with tactile feedback about their immediate environment. Two small, webcam-sized video cameras wired to a portable computer feed information into a special glove worn by the user. The glove has vibrating buzzers sewn into each finger that send impulses to the user, warning of terrain fluctuations up to 30 feet ahead.

Huosheng Hu and Penny Probert [12] did similar work using ultrasound beams to find the nearest obstacle on the path. They went one step further, using a frequency-modulated ultrasound sensor to extract environmental feature information. The sensor consists of a separate transmitter and receiver. It transmits the different signals as a continuous tone to the user through an earpiece and presents an auditory map of the environment. Different ranges to obstacles appear as different pitches, and the loudness of the sound indicates how large a reflection occurred. The user can distinguish between single and multiple objects and learn the sound of particular feature shapes.

The main disadvantage of this system is that it blocks the user's sense of hearing, which might be a vital source of information for visually impaired people.

3.1.2 Location and Orientation

There are many ways to determine the location and orientation of the user, varying in the extent to which they require sensors or information from the external environment. At one extreme, all kinds of sensors are used to detect the user's current state, as A. R. Golding and N. Lesh did [13]; at the other extreme, no sensor is used, but a camera records images of the environment that are compared with 3D image models stored in the computer, as S. Feiner et al. did [14]. In between are methods using local and global positioning systems, in which infrared or ultrasound transmitters, GPS or its Russian equivalent (GLONASS) are used to determine the current location and orientation.

The most sensor-intensive system is that of Andrew Golding et al. They perform this context-aware task using a set of cheap, wearable sensors that include a 3D accelerometer, a 3D magnetometer, a fluorescent light detector and a temperature sensor. The sensors are attached to a utility belt. The accelerometer detects the user's acceleration in three dimensions, while the magnetometer measures the strength and direction of the magnetic field; the fluorescent light detector extracts the 60 Hz component of the signal from a photodiode aimed at the ceiling to determine direction, and the temperature sensor reads the room temperature. The data acquisition module continuously reads tuples of sensor readings at specific intervals and converts this information into canonical units. The raw sensor signals must be "cooked" to make them suitable for a machine-learning algorithm; in other words, the raw readings are augmented with computed features. Then data modeling builds a model of the environment at training time, and the navigation model infers the user's location at run time. Figure 3-1 shows the structure of this navigation system.

Figure 3-1 Structure of Navigation System Using Wearable Sensors (pipeline: data acquisition, data cooking, data modeling and navigation)

According to the experiments, the performance is quite good for a simplified office environment. To apply this method to a more complex world with good accuracy, better cooking algorithms would need to be designed and applied appropriately.

Another example of using sensors to detect the environment is VibraVest/ThinkTank, developed by Steve Mann [15]. This apparatus is a computational tank top worn in close contact with the body, under ordinary clothing, to afford a synthetic synesthesia of a new sensory modality, namely radar, which is translated into "feel." The chirplet transform and other DSP methods can detect targets accelerating toward the wearer, helping him or her avoid bumping into things, and similarly make the wearer blind to targets that are moving away, solving the "information overload" problem.

The other extreme is to use a head-mounted camera and 3D models. Sequential images are first geo-referenced manually and registered in a database. Then, through the registered images, landmark lines are transferred to the other, unregistered images by image-to-image matching based on straight-line features, to obtain accurate position and orientation for the real-world images taken by the camera later [14, 16]. If no common landmark lines can be clearly seen in two neighboring images, relative orientation is used to compute the new image's translation and rotation relative to its predecessor by matching the neighboring images. An electronic compass and a gyroscope are necessary. Recognizing a visual landmark in a cluttered environment is a very complex task because landmarks generally present very different appearances depending on the location they are seen from. The difficulty lies in determining the X and Y coordinates and the yaw angle of the camera.

The principle of image sequence analysis based on landmark lines is best illustrated by the "touring machine" [14]. It includes single-image calibration, relative orientation of stereo images, sequential image analysis, and straight-line extraction and matching. This method places extra requirements on the wearable computer, which must work with readily available peripherals, including high-performance 3D graphics cards. It also requires a previously registered 3D image database and a graphics interface for users to display the images and the contents of the GIS database. All these requirements surely increase the cost and decrease the response speed, making the system less practical than it is supposed to be.

The in-between systems may be divided into two categories: one uses GPS information, the other uses infrared or ultrasound transceivers. The key problem in a navigation system is to determine where the user is located, which can then be converted to the coordinates of a local GIS database to get the optimal path. Most current systems use GPS for this task. GPS is a worldwide radio-navigation system formed from a constellation of 24 satellites and their ground stations. It is normally accurate to within meters.

Figure 3-2 Location Guidance System, Hideo Makino et al.

Loomis was one of the first to propose the idea of a navigation system for the blind using GPS and acoustic information. In the 1990s, he built a navigation system for the blind using DGPS with an FM correction-data receiver for stable determination of the traveler's location [18]. Hideo Makino et al. developed a system using GPS information in two basic units in 1997.

The first is the mobile unit for the blind traveler; the other is a base station that processes coordinates received from the traveler through a mobile telephone and offers geographical information back to the traveler. The error is 16 meters maximum. This system is illustrated in figure 3-2 [17], above.

The GPS signal is affected mainly by the deliberate degradation of the signals, called selective availability (SA). The obvious way to address this problem is to increase the number of satellites available. GLONASS (GLObal Navigation Satellite System) is the Russian equivalent of GPS [19]. It has 19 operating satellites and is not affected by SA. In stand-alone mode GLONASS is accurate to 20 m; its stand-alone accuracy is about 10% better than that of GPS. Another solution is Differential GPS (DGPS). DGPS uses two receivers communicating over a radio data link: one base receiver has fixed, known coordinates while the other is mobile. Errors in the signals arriving at the base receiver are computed and used to correct the signals at the mobile one. The location guidance system by Hideo Makino described above uses a DGPS receiver.

Although the accuracy of GPS, or of the combination of DGPS and GLONASS, can reach the centimeter level in some applications, this method does not work well in urban areas, where GPS signals are interrupted by moving vehicles or blocked by tall buildings, highway bridges or big trees. It also does not work indoors.

There are many other ways to support navigation in areas where GPS information is hard to get. Some use active badges, beacon architectures or ceiling-mounted infrared transceiver systems installed in the building [20, 21]. This approach requires a great deal of effort and expense to modify buildings. In [20], each transmitter and receiver used for position sensing is built into buildings like malls, auditoriums and conference halls. Each transmitter emits a unique ID number into the environment.

Once the user passes through a space with a built-in transmitter, the receiver picks up the IDs from the IR transmitters and sends the information to the wearable computer to compute the accurate position of the user. This system also has a sparse 4-by-4 stimulator array that delivers directional cues by means of the sensory saltation phenomenon. The flaw of this technique is that when the user passes the transmitter's location quickly, the IR signal may not be received properly, and the user might receive wrong location information.

A variant proposal has three parts [21]. The first part is a mainframe computer that contains the database of all the transmitter IDs and is on all the time; it can direct the user with location and path information. The second is a series of built-in transceivers and sensors connected to the mainframe computer, which send the information from the IR signals to the mainframe. The third part is a headset that allows the user to communicate with the mainframe computer. This headset can also emit infrared light that can be detected by the transceivers, and the information can be sent to the mainframe via high-speed Ethernet to locate the user.

3.2 Outdoor Version of Drishti Navigation System

3.2.1 System Design

The outdoor version of Drishti by Steve Moore [2] is a navigation system for visually impaired people that uses a DGPS system to obtain the user's outdoor location. The primary goal of this system is to augment a visually impaired person's pedestrian experience with enough information that he or she feels comfortable and at ease walking outside, even in an unfamiliar environment.

Figure 3-3 shows the client/proxy/server architecture of Drishti. The server and client manager are developed in Java. The mobile computer serves as a client to the DGPS server; it takes the user's voice input and obtains an accurate location. It also communicates with the GIS database to get the optimal route, or to contact the police department, etc., if needed.

Figure 3-3 Client/Proxy/Server Architecture of Drishti

Because the messages are small, the system takes advantage of the low overhead of User Datagram Protocol (UDP) sockets and avoids the delay the Transmission Control Protocol (TCP) incurs in dividing a message into packets and reassembling it at the other end. The client manager residing in the wearable computer gets geometric information as longitude and latitude from the GPS receiver via serial port input and passes these GPS coordinates to the Navigation Manager. There are two Navigation Managers, one residing on the client side and the other on the server side. If it is on the client's side, each GPS coordinate is displayed in the Path Viewer, which is built using Java2D classes and allows users to view their current location, environmental features and route information; if it is on the server's side, the Sender sends the location objects to the server one by one.

Figure 3-4 Mobile Client Components Interaction

The server listener receives the GPS coordinates as a current-location object and places it in the Navigation Manager (NM) queue, where it waits for processing. The NM is a thread that continually attempts to remove coordinate packets from the queue and process them. Because the GPS coordinate object is updated every second, the NM does not provide navigation prompts for every location. The DGPS Listener has a method for marking the next location object to be spoken to the user. The NM has a route object that contains the route the client is currently on. The NM asks the Route for prompt information, which contains the user's current location and direction along the route. This prompt is wrapped as an object and sent back to the client, where it is spoken by the speech synthesizer to inform the user.

Figure 3-5 shows a sample voice prompt for one route:

- Starting from Computer Science
- Turn left on to Hub Walkway 2
- Travel on Hub Walkway 2 for 79 feet
- Turn left on to Stadium Road Walkway
- Travel on Stadium Walkway for 225 feet
- Turn left into stop #2
- Starting from stop #2
- Turn right on to Stadium Road Walkway
- Travel on Stadium Rd. Walkway for 225 feet
- Continue straight onto Hub Walkway
- Travel on Hub Walkway for 81 feet
- Turn left onto Black Hall Walkway
- Travel on Black Hall Walkway for 111 feet
- Turn right into Mathematics

Figure 3-5 Sample Voice Prompt of a Route

Given the current location and the destination, the system should return the optimal route. But for a visually impaired person, the optimal route is not necessarily the shortest route, because he or she may care more about safety. It is not uncommon for the shortest route to involve crossing roads, stairways or ramps that are not convenient for visually impaired people. One of the most important advantages of Drishti compared to other systems is that it can deliver landmark information along the blind person's path in real time, warn about potential hazards, generate routes preferable to the user, re-route the user if the current route is not available (for example, if the sidewalk is under construction or the user changes his or her mind and decides to go somewhere else), and add notes to the system's GIS database for future processing. This GIS database is made available to various campus departments, such as the University Police, the Physical Plant and Special Events, so that they can insert and remove dynamic obstacles. The client has a ListBrowser that can provide the user with known building names. The FromToListener is then activated by saying "route."

It asks the user to say the starting place at the prompt "from," and it expects the user to say the destination after it prompts "to." It then sends the request to the server, asking for a route. The user can also request the addition of new information by saying "add place" or "add end point." Drishti downloads these place names from the GIS database and writes them in JSGF to a StringBuffer, which can be loaded as a grammar and activated. The new rule is added in the grammar format, and Drishti can then understand the new places. Whenever the user requests a route, Drishti presents the optimal route from the current location to the destination according to the latest road information. Figure 3-6 displays the browse list and the communication between the user and Drishti.

GIS is adopted to provide a spatial database of the environment, to inform the user if he or she is close to a building or needs to cross a speed bump or some stairs. It is accessed via a wireless network. Drishti obtained the GIS dataset for the UF campus from UF's physical plant division. The scale of the dataset is a critical factor in navigation systems; the systematic error of the current GIS layers is 2 meters, and Drishti accounts for this error while determining the user's current location.

As stated above, Drishti performs very well for outside navigation, but because of the attenuation of the GPS signal by buildings, trees and bridges, the system needs to be augmented to work in urban areas and especially for indoor navigation. This is the motivation for the integrated indoor/outdoor navigation system, in which the Hexamite ultrasound location system is adopted for indoor location. We changed the way Drishti communicates between user and system by adding another server to collect the coordinates via the Hexamite system.

I will discuss this combined new system in detail in the following chapters.

User > "where can I go"
Drishti > "known buildings are", "Little", "Music", "Tigert"
User > "more"
Drishti > "Computer Science Engineering", "Matherly"
User > "departments"
Drishti > "known departments are", "Mathematics", "Journalism"
User > "more"
Drishti > "Computer Science", "Forestry", "end of list"
User > "Stop"
Drishti > "ok"
User > "route"
Drishti > "from"
User > "Mathematics"
Drishti > "did you say Mathematics"
User > "yes"
Drishti > "to"
User > "Computer Science"
Drishti > "did you say Computer Science"
User > "yes"
Drishti > "ok, and away we go"

Figure 3-6 User Browses List of Available Destinations and Requests a Route

3.2.2 COTS Hardware and Software

Drishti uses some commercial-off-the-shelf (COTS) hardware and software, including a Trimble ProXRS, a 12-channel integrated GPS/Beacon/Satellite receiver with multipath rejection technology, to receive GPS signals, and a Xybernaut wearable computer for client request processing. The prototype weighs approximately 8 lbs., which is considered acceptable by most blind and disabled persons. The wearable computer as well as the GPS receiver is carried in a backpack. An integrated headset with an earphone and microphone is used to give vocal commands and to query and receive route instructions, obstacle prompts and geometric information.

The IBM ViaVoice interface is used as the vocal tool for user and server communication. Drishti also uses ESRI's COTS software: ArcView to make the spatial databases of sidewalks, and ArcSDE for database management and route storage. The Network Analyst extension in ArcView can generate least-cost routes through a network.

CHAPTER 4
THE INTEGRATED INDOOR/OUTDOOR DRISHTI

4.1 System Architecture

Figure 4-1 Wearable Mobile Client (wearable computer, headset and GPS receiver)

This thesis extends the outdoor version of Drishti to a complete navigation system by integrating an indoor positioning system. The Hexamite low-cost positioning device is used to locate the user in indoor environments.

The only additions to the user's load are two ultrasound transceivers that are smaller than a credit card and can be tagged onto the user's shoulders with Velcro. Figure 4-1 depicts a user with all the equipment on a test run. The smart home is taken as an example to describe how the whole system works; the architecture is displayed in figure 4-2.

Figure 4-2 Integrated Indoor/Outdoor Client/Proxy/Location Server Architecture

The client communicates with the user via headphone and microphone, enabled by the IBM COTS software ViaVoice. The user speaks into the microphone using the commands defined in the system grammar, making queries about his or her location and asking for route and obstacle prompts. If the user is outside, the client has two ways to get the location: one is through the Navigation Manager on the server side, which processes the coordinates and returns prompt information about the location; the other is through the Navigation Manager local to the client, which gets the coordinates directly from the GPS receiver and checks the current location's status on the user's path, which was first placed in the client when the user asked for a route.

Because this process is done locally, it runs fast. The client Navigation Manager piles up all the requests in a queue and asks the Sender to send these request objects to the server. The server has a ClientListener that listens for requests from the client at all times. Once it receives a request, it forwards the request to a different queue according to the request type. Then the ClientServer, a server proxy dealing with all the requests from the client, asks different task managers to finish the requested tasks. If the request asks for a route, the server InfoReqManager gets the starting point and final destination and asks the SDEClient to fetch the path, which is put in the reply queue until the InfoSender sends it back to the client. The client has an InfoListener, a LocationListener and a FromToListener, all listening to the server at all times. The InfoListener gets the packet and asks VocalView to speak to the user.

If the user moves indoors, he or she can change the navigation mode to indoor by saying "Room" or "indoor" to Drishti. The user can then ask for a great deal of information about the room and the layout of the furniture. If the request asks for the current location, the ClientServer asks the InfoReqManager to get the request object from the queue and send it to the SdeClient, a client sitting in the server proxy that connects to the ArcSDE server. The SdeClient connects to both the SDE server and the Hexamite server. After the SdeClient gets the coordinates from the Hexamite server, it sends a query to the SDE server and gets the current location. The result is wrapped in a reply object and put in the reply queue. The server has an InfoSender, which picks up the object from the queue and sends it back to the client.

4.2 Interactions of Components

4.2.1 Client

The client is composed of three main parts: a vocal interface, a GPS receiver and a communicator. Each part contains many functions or files, as displayed in figure 4-3.

Figure 4-3 Client's Three Components: Vocal Interface, DGPS Receiver and Communicator

The vocal interface exploits the IBM COTS software ViaVoice, which can understand what the user asks and talk back to reply to the user's request. The vocal interface can be programmed using the javax.speech package. This package contains a recognizer, a synthesizer and rules defined by the designer in a grammar. A Recognizer provides access to speech recognition capabilities.

The primary capabilities provided by a recognizer are grammar management and result handling. A Grammar defines a set of tokens (words) that may be spoken and the patterns in which those tokens may be spoken. We use the RuleGrammar format. The RuleGrammar interface describes a Grammar that defines what users may say through a set of rules. The rules may be defined as rule objects that represent the rules in a data structure, or as defined in the Java Speech Grammar Format (JSGF). The format of the rules we wrote is shown below:

    grammar fromto;

    public <request> = where can i go {wherego}
                     | places {wherego}
                     | destinations {wherego}
                     | where am i {whereami}
                     | location {whereami}
                     | how are you {howareyou} ;

When a grammar is active, the recognizer listens for speech in the incoming audio that matches the grammar. When such speech is detected, the recognizer produces a result. The result object is passed to the application and contains information about which words were heard.

The primary function provided by the Synthesizer interface is the ability to speak text, speak Java Speech Markup Language (JSML) text, and control an output queue of objects to be spoken. A Synthesizer is created by a call to the Central.createSynthesizer method. The default voice is male and the language is English, both of which can be modified by the designer. In this project, we define a VocalView, which can speak plain text or a String in JSML format.
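To show how these pieces fit together, the following is a hedged sketch using the javax.speech (JSAPI 1.x) API described above: it loads a shortened version of the fromto grammar and reacts to an accepted result by reading its tags. This is illustrative, not the Drishti source, and engine selection and error handling are simplified.

    // Sketch of the recognition loop: load a JSGF grammar, enable it, and
    // handle RESULT_ACCEPTED events by extracting the rule tags.
    import java.io.StringReader;
    import javax.speech.Central;
    import javax.speech.recognition.*;

    public class FromToDemo {
        public static void main(String[] args) throws Exception {
            Recognizer rec = Central.createRecognizer(null); // default engine
            rec.allocate();
            RuleGrammar grammar = rec.loadJSGF(new StringReader(
                    "grammar fromto;\n" +
                    "public <request> = where am i {whereami} | places {wherego};"));
            grammar.setEnabled(true);
            grammar.addResultListener(new ResultAdapter() {
                public void resultAccepted(ResultEvent e) {
                    FinalRuleResult result = (FinalRuleResult) e.getSource();
                    // Tags like {whereami} tell the client which request to queue.
                    String[] tags = result.getTags();
                    System.out.println("request tag: " + tags[0]);
                }
            });
            rec.commitChanges(); // make the enabled grammar active
            rec.requestFocus();
            rec.resume();        // start listening
        }
    }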

34 The Differential GPS receiver is connected to the serial port com2 on the wearable computer and is configured to output a NMEA 0183 sentence, which is an ASCII string that contains Global Positioning Fix Data. The format of the sentence is as follows: $GPGGA,hhmmss,xxxx.xx,a, yyyyy.yy,a,x,xx,x.x,x.x,M,x.x,M,x.x, xxxx*hh in which hhmmss is the UTC (Coordinated Universal Time) of the position in hours, minutes and seconds, xxxx.xx,a is the latitude, North/South and yyyyy.yy,a is the longitude, East/West. FromtoListener is the trigger of the client control, which calls VIClient to start various function calls according to the different requests proposed by the user. Once the user makes a request, a new result object is created when the recognizer detects an incoming speech that may match the grammar activated when the client first started. Once the recognizer completes recognition of the Result that it chooses to accept, it finalizes the result with a RESULT_ACCEPTED event that is issued to the ResultListeners attached to the Recognizer, matched Grammar, and the Result. The VIClient is invoked to perform different functions according to the accepted tokens that are expressed as String. The VIClient is the core of the client functions. All the managers and listeners are implemented as new threads. Many first-in-first-out queues are initialized which include the direction queue and coordinates queue for the local Navigation Manager if the navigation mode is set to local. The information request queue is also made here. The Sender is initialized to send requests from the queue. The direction listener and information listener start listening to the predefined port for the incoming reply object. Different functions are implemented by wrapping different requests in different packet headers. Each packet header identifies the type of request and packet body describing the detailed request. These packets are inserted into the queues where they wait to be sent to the server by the Sender.


The Sender is one direction of the communication bridge between client and server. It uses the UDP (User Datagram Protocol) communication protocol. The Sender works continuously, filling packets by removing objects from the queue and sending them to the server. If the object is an information object, the Sender waits for an acknowledgement from the server after sending the packet, to make sure the server is active and the packet is not lost. If the object is a coordinate object, the Sender sends it without asking for an acknowledgement, because the coordinate object is updated every second and is continuously changing.

The other direction of the client/server communication bridge consists of various listeners. The DGPS Listener is a thread that registers as a serial port event listener and is notified whenever there is input (a byte stream) from the DGPS receiver via the serial port. The listener parses the byte stream to get a (latitude, longitude, fix quality) tuple and then creates a new coordinate object. This object is passed to the Sender, waiting to be sent to the Navigation Manager on the server side if the user sets the navigation mode to "setserver." The object is passed to the Navigation Manager on the client side if the navigation mode is "setlocal"; the latter can only be done after the server passes the route object back to the local Navigation Manager. Each coordinate is shown on the Path Viewer, a small panel for checking the user's current status on the path, and the VocalView speaks out the navigation prompts.

There are two direction listeners, each implemented as a thread. One waits for the infoObject from the server carrying the navigation prompt. The other works locally, watching the direction queue and unwrapping the infoObject from the queue to get the local navigation prompt.
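The following is a minimal sketch of that acknowledgement policy, assuming a byte-array queue, a one-byte packet header ('I' for information objects, 'C' for coordinates) and a 500 ms ack timeout; all of these values are assumptions, not the thesis's actual constants.

import java.net.*;
import java.util.concurrent.BlockingQueue;

public class SenderSketch implements Runnable {
    private final BlockingQueue<byte[]> queue;
    private final DatagramSocket socket;
    private final InetAddress server;
    private final int port;

    SenderSketch(BlockingQueue<byte[]> queue, String host, int port) throws Exception {
        this.queue = queue;
        this.socket = new DatagramSocket();
        this.server = InetAddress.getByName(host);
        this.port = port;
    }

    public void run() {
        while (true) {
            try {
                byte[] body = queue.take();   // blocks until an object is queued
                socket.send(new DatagramPacket(body, body.length, server, port));
                if (body[0] == 'I') {         // information object: wait for an ack
                    socket.setSoTimeout(500);
                    byte[] ack = new byte[16];
                    socket.receive(new DatagramPacket(ack, ack.length));
                }                             // coordinate objects are fire-and-forget
            } catch (Exception e) {
                System.err.println("send failed: " + e); // keep the loop alive
            }
        }
    }
}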


The Information Listener is a thread that always listens to the server for all information objects except navigation prompts. The information objects are extracted from the ByteArrayInputStream, and the reply to the request is spoken to the user via the VocalView.

4.2.2 ClientServer Proxy

Figure 4-4 Client Manager Architecture

Figure 4-4 illustrates the client manager architecture that manages the client/server communication. In the server proxy, the clientlistener is implemented as a thread that continuously listens for requests from the registered client address and extracts the incoming object according to the package header type. There are three kinds of packages (requests): information objects that are handled by the Information Request Manager (IRM), GPS coordinate objects that are handled by the Navigation Manager (NM) to provide navigation prompts, and register objects that register the client address with the server.


To make sure the information object is received by the server, and because this kind of object is not time sensitive, it requires an acknowledgement. The GPS coordinate object arrives at a very fast pace and is continuously changing, so no acknowledgement is necessary. The clientlistener puts the information objects and GPS coordinate objects into different first-come-first-served queues for the information manager and navigation manager to use.

How the GPS coordinate object is handled and how the Navigation Manager works are explained in Chapter 3 and Figure 4-5. Here, I will describe how the indoor part of the server works; how the location server proxy is bundled and how it works is illustrated in the next chapter.

There are many kinds of information requests, depending on the different queries the user makes. The IRM is implemented as a thread that continually takes requests from the information queue. If the queue is empty, the IRM keeps waiting. Once a request comes up, say a current-location request, the IRM asks the VISDEClient to process the request, wrap the reply in an information object and put it in the reply queue, from which the Sender picks up the information object and sends it to the client. The process is illustrated in the following flowchart.
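A sketch of the clientlistener's dispatch loop is given below: one UDP socket, a one-byte package header routing into separate FIFO queues, with an ack sent only for information packets. The header byte values and field layout are assumptions for illustration.

import java.net.*;
import java.util.Arrays;
import java.util.concurrent.*;

public class ClientListenerSketch implements Runnable {
    final BlockingQueue<byte[]> infoQueue = new LinkedBlockingQueue<>();
    final BlockingQueue<byte[]> gpsQueue = new LinkedBlockingQueue<>();
    private final DatagramSocket socket;
    private SocketAddress registeredClient;

    ClientListenerSketch(int port) throws Exception {
        socket = new DatagramSocket(port);
    }

    public void run() {
        byte[] buf = new byte[1024];
        while (true) {
            try {
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                socket.receive(p);
                byte[] body = Arrays.copyOf(p.getData(), p.getLength());
                switch (body[0]) {
                    case 'R':   // register object: remember the client address
                        registeredClient = p.getSocketAddress();
                        break;
                    case 'I':   // information object: ack it, then queue for the IRM
                        socket.send(new DatagramPacket(new byte[] {'A'}, 1, p.getSocketAddress()));
                        infoQueue.put(body);
                        break;
                    case 'C':   // coordinate object: no ack, queue for the NM
                        gpsQueue.put(body);
                        break;
                }
            } catch (Exception e) {
                System.err.println("listener error: " + e);
            }
        }
    }
}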


Figure 4-5 Work Process of DrishtiIndoor

The VISDEClient needs two pieces of information to satisfy the user's request for his or her current location. The first is the user's coordinates, and the second is the relation of those coordinates to the indoor facility. To get the coordinates, the VISDEClient asks the HexClient to communicate with the Location System Bundle, which throws events containing the coordinates twice every second. This bundle sits on a different server and is connected to the Hexamite ultrasound location system; I will explain this in the next section. With these coordinates, the VISDEClient asks the SDEClient to communicate with the SDE (Spatial Database Engine) server. The com.esri.sde.client Java API is then used to compute the relationship between the shapes from the SDE and the shapes made from the known coordinates. Different shapes are retrieved from the database and compared with the user's shape, which is a buffer with diameter 0.1 foot centered at the coordinate.
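The exact com.esri.sde.client calls are not reproduced here; the following is a plain-geometry stand-in for the two checks described next (containment in a room and proximity to a wall or piece of furniture), using java.awt.geom and treating the 0.1 foot buffer as a small radius around the user's point.

import java.awt.geom.Line2D;
import java.awt.geom.Path2D;

public class RoomCheckSketch {
    // True if the user's point lies inside the room polygon.
    static boolean inRoom(Path2D room, double x, double y) {
        return room.contains(x, y);
    }

    // True if the user is within 'limit' feet of a wall or furniture edge;
    // the edge is {{x1,y1},{x2,y2}} and ptSegDist is point-to-segment distance.
    static boolean tooClose(double[][] edge, double x, double y, double limit) {
        return Line2D.ptSegDist(edge[0][0], edge[0][1],
                                edge[1][0], edge[1][1], x, y) < limit;
    }
}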


If this shape is contained in the room shape, we can say the user is in that room. If this shape is within a specific distance of a furniture shape or a room shape boundary, the system prompts the user that he/she is too close to the furniture or wall.

If the request is about how to get to a place, for example a room, we need some geometric calculation, because the SDE Java API alone cannot satisfy the request. The Hexamite location system gives the user's coordinates as well as the orientation. We can use the orientation and the layout of the room plan to calculate the angle at which the user should turn and the distance ahead, as illustrated below in Figure 4-6.

Figure 4-6 Example of Geometric Calculation of Orientation

As displayed in the above figure, the user can ask the system for directions to the desired destination, and the system may tell him/her the angle at which he should turn and the distance he has to travel, or it may correct the user's orientation along the way and guide the user step-by-step to the destination.
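A sketch of that calculation follows, assuming the orientation is expressed in degrees clockwise from north and the room plan supplies the destination coordinates; both conventions are assumptions, since the thesis does not spell them out.

public class TurnSketch {
    // Returns {turn, distance}: degrees to turn (negative = left) and distance to walk.
    static double[] turnAndDistance(double ux, double uy, double heading,
                                    double dx, double dy) {
        double bearing = Math.toDegrees(Math.atan2(dx - ux, dy - uy)); // 0 = north
        double turn = bearing - heading;
        while (turn > 180) turn -= 360;    // normalize into (-180, 180]
        while (turn <= -180) turn += 360;
        double distance = Math.hypot(dx - ux, dy - uy);
        return new double[] { turn, distance };
    }
}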




CHAPTER 5
LOCATION SERVER

The location server for Drishti is made up of two components. One is the Hexamite location system, which uses ultrasound devices for high-resolution tracking and guidance and is developed by an OEM company called "Hexamite" in Australia. The other is the Drishti location server proxy, which uses the Open Services Gateway Initiative (OSGI) to bundle the custom service software of the Hexamite system and provide the Drishti server with the indoor location of the user.

5.1 Hexamite Location System

5.1.1 Hardware Components

The Hexamite ultrasound location system consists of at least two Hexamite Positioning Devices, where one device knows the distance to another. The device that knows the distance to the other is called the pilot, while the other device is called the beacon, as shown in Figure 5-1. In this project, the HE900M is the pilot and the HE900T is the beacon. The system computes the distance between the pilots and beacons from the travel time of the ultrasound between them and the speed of sound. The third part of the system is the RS485/RS232 converter, which connects all the pilots to the central computer.

The HE900T scans for ultrasonic activity; if nothing is detected, the device goes progressively into a deeper and deeper sleep mode that saves power. It comes with an internal rechargeable manganese dioxide lithium battery, which allows the user to wear these beacons without wires. The HE900T can be fully charged through pin 4 (negative) and pin 8 (positive) on the back of the device in 10 hours, and can then be discharged (used) for 10 hours.


We attach two HE900Ts to the shoulders of the user to find his or her location (coordinates) and orientation.

Figure 5-1 Hexamite Location System Devices (HE900T beacon and HE900M pilot)

The HE900M is mounted on the ceiling facing the center of the house. It is connected to an RS485 network, which is plugged into the serial port of the central computer through an RS485-to-RS232 converter. The detection angle of the HE900M is 130 degrees at a distance of 6 meters; at a distance of 8 meters, the angle is 75 degrees. The detection range can be up to 16 m, and the maximum resolution is 0.3 mm.

Any one of the pilots can be configured as the master. The master initiates the timing or distance acquisition cycle of the whole system by transmitting a synchronization signal at the beginning of the cycle. At the end of the cycle, the pilots transmit their positioning data one after another over the serial network. The location system can consist of a limitless number of Hexamite Positioning Devices configured as pilots and beacons, forming a larger system for better location precision. In this project we have four pilots, one at every corner of the smart house, to give 360 degrees of coverage, as represented in Figure 5-2.


Figure 5-2 Hexamite Ultrasound Location System Coverage of the Smart House (kitchen, living room, bedroom and bathroom, with pilots 1-4 at the corners and the RS485/RS232 converter)

5.1.2 Hardware Configuration

The Hexamite Positioning Device can be configured through its serial port and can be operated by and connected directly to a personal computer. The software on the central computer communicates with the hardware, calculates the position and orientation, and creates location events for the location server. It first sends an escape control character to ready the devices for commands; it then reads Settings.txt and sends the setup strings to the HE900Ms. If the setup configuration is feasible, the HE900Ms configure themselves and return a "+".


After the custom software receives a "+", a carriage return (0D) is sent to the hardware to start an acquisition cycle. This configuration initialization is done in the class Location.java. Figure 5-3 illustrates our sample Settings.txt file.

Figure 5-3 Settings.txt for Hardware Configuration

Once the Settings.txt file is found, the custom software transmits the string following the ASCII character # through the central computer's serial port for the hardware configuration. This system consists of 6 devices, four of which are configured as pilots and two as beacons. The pilots are linked via the serial ports; the beacons are mobile and out of reach, so they are not linked.

The first part of the string is the device address (DA). Q, O, S and Y are the secondary addresses of the four HE900Ms; each secondary address is the letter before the device's primary address. The primary addresses of the pilots in this system are P, N, R and X, which correspond to the decimals 82, 80, 81 and 88 respectively. The second part of the setting line is the Programming Control Byte (PCB), which configures the three slaves and one master.
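The handshake just described can be sketched as follows, assuming the serial streams have already been opened (for example through javax.comm); the escape byte value and method names are illustrative.

import java.io.InputStream;
import java.io.OutputStream;

public class PilotConfigSketch {
    static void configure(OutputStream toPilot, InputStream fromPilot,
                          String setupLine) throws Exception {
        toPilot.write(0x1B);                 // escape readies the device for commands
        toPilot.write(setupLine.getBytes()); // one setup string from Settings.txt
        toPilot.flush();
        if (fromPilot.read() == '+') {       // "+" means the setup was accepted
            toPilot.write(0x0D);             // carriage return starts an acquisition cycle
            toPilot.flush();
        } else {
            throw new Exception("pilot rejected the configuration");
        }
    }
}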


A slave is set as 01 while the master is 11. A PCB of 11 corresponds to an integer representation of 17, which sets bits 4 (2⁴) and 0 (2⁰), as displayed in Figure 5-3. The slaves should be configured before the master. They all transmit sonic synchronization; of the four pilots, the pilot closest to a beacon is the one that synchronizes it. The beacons send an ultrasonic signal to the pilots at a predefined time interval, and after receiving the signal, the master pilot creates a distance string and passes it to the next pilot, as configured by the termination byte.

The third part is the Termination Byte (TB). If the termination byte of device 'X' is the primary address of device 'Y' on the same network, then device 'Y' will be prompted to transmit its position acquisition result once device 'X' has completed the transmission of its results. The last device in the setting chain has a TB of "carriage return" (0D). The TB of the master device X is the hexadecimal number 52, which indicates pilot R (secondary address S), so that after X finishes transmitting its distance string, pilot R will follow, then N (secondary address O), and then P (secondary address Q). P calls for another cycle by setting its TB to 0D. We also call pilots X, R, N and P pilots 1, 2, 3 and 4 respectively, as in Figure 5-2.

The fourth part of the string is the Number of Beacons Byte (NBB), which is one more than the number of beacons the pilot looks for during the position acquisition cycle; in this system, pilots look for 2 beacons. The fifth part indicates the number of pilots from which a beacon receives messages, which is 1 in this system. The sixth part is the number of devices in the system, 6 in our case. The remaining parts concern the beacons and are omitted here.


5.1.3 Distance String

The distance strings are appended one by one, from the master pilot to the slaves, producing a transmission cycle like the following:

-0E22 0123 0C34 R0B33 0E13 23A1 N03AB 3345 09C3 P0202 0FE3 00F2

The hyphen "-" in front of the string means the device has entered the position acquisition cycle. 0E22 is the distance between the master pilot and the nearest obstacle. Each slave's segment begins with the name of that pilot followed by the distance between that slave and the master, as in R0B33, N03AB and P0202. 0123 is the hexadecimal distance from the master pilot 1 (pilot X) to the first beacon, which is 291 mm; 0C34 is the distance from pilot X to the second beacon, which is 3124 mm; 0E13 is the distance from pilot R to the first beacon, and so on.

We collect all the distances in one cycle, choose the shortest and next-shortest distances to each beacon, and convert them using the following formulas:

shortest distance to beacon #1 = shortest distance to beacon #1 / 2
next shortest distance to beacon #1 = next shortest distance to beacon #1 - shortest distance to beacon #1
shortest distance to beacon #1 = 0.688 × shortest distance to beacon #1
next shortest distance to beacon #1 = 0.688 × next shortest distance to beacon #1

The formulas for beacon #2 are the same. We first divide the shortest distance to the beacon by 2 because the nearest pilot's reading covers the time for sound to travel from the pilot to the beacon and back. When we calculate the next-closest distance to the beacon, we subtract the (halved) shortest distance from the original next-shortest reading. This is because of the way the pilots ping the beacons: at the beginning of a cycle, all the pilots send out ultrasound at the same time to ping the transceivers (the beacons in our case). The beacons accept only the first signal, no matter which pilot sent it; they then time-stamp the signal and broadcast it back to all pilots.
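A hedged sketch of parsing one such cycle is shown below; it relies only on the layout just described (a leading master segment, then three-token slave segments whose first token fuses the pilot letter with a hex value) and uses illustrative names.

import java.util.LinkedHashMap;
import java.util.Map;

public class DistanceStringSketch {
    // Maps pilot name -> {distance to beacon #1, distance to beacon #2} in raw hex units.
    static Map<String, int[]> parse(String cycle) {
        String[] t = cycle.substring(1).trim().split("\\s+"); // drop the leading '-'
        Map<String, int[]> out = new LinkedHashMap<>();
        // Master segment: t[0] is its obstacle distance; t[1] and t[2] are beacon distances.
        out.put("master", new int[] { Integer.parseInt(t[1], 16),
                                      Integer.parseInt(t[2], 16) });
        for (int i = 3; i < t.length; i += 3) {
            String name = t[i].substring(0, 1); // pilot letter, e.g. "R"
            out.put(name, new int[] { Integer.parseInt(t[i + 1], 16),
                                      Integer.parseInt(t[i + 2], 16) });
        }
        return out;
    }
}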


The pilots receive this signal and record the time of arrival, so each recorded time is the time the accepted signal took to travel from the nearest pilot to the beacon plus the time the return signal took to travel from the beacon back to that pilot. The factor 0.688 is the resolution coefficient of the timer that measures the time of flight; the timer increments every 2 microseconds. The beacon transmits organized signals that help each pilot distinguish it from the others, which is also what lets the system tell the orientation of the user. Because obstacles on the way from a pilot to a beacon easily reflect ultrasound, we use the shortest and next-shortest distances to calculate the location and so avoid error.

L2² - FM² = H²  (5-1)
L1² - SM² = H²  (5-2)
FM + SM = FS  (5-3)
FM = (L2² - L1² + FS²) / (2 × FS)  (5-4)
H = √(L2² - FM²)  (5-5)

Figure 5-4 Location Calculation Trilateral

FM and H can be used to get different coordinates in different scenarios. For example, when the beacon is closest to pilot 1 and pilot 2, the calculation process may be illustrated as in Figure 5-5.
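Equations (5-3) through (5-5) translate directly into code; the sketch below recovers FM and H from the two chosen distances L1 and L2 and the known pilot spacing FS. The class name is illustrative.

public class TrilaterationSketch {
    // Returns {FM, H} given L1 (next shortest), L2 (shortest) and the pilot spacing FS.
    static double[] solve(double l1, double l2, double fs) {
        double fm = (l2 * l2 - l1 * l1 + fs * fs) / (2 * fs); // equation (5-4)
        double h = Math.sqrt(l2 * l2 - fm * fm);              // equation (5-5)
        return new double[] { fm, h };
    }
}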


Figure 5-5 Location Calculation Example

In the above case, side L2 is the shortest distance and side L1 is the next-shortest distance. The line TM is perpendicular to the line FS and parallel to the X-axis, so the coordinates in this case can be expressed as (H, FS - SM) in the format of (X coordinate, Y coordinate). The class Tag.java calculates the coordinates in 8 such cases.

Because each beacon has its own ID, we can distinguish the two beacons as left and right and obtain the orientation once the coordinates of both beacons are available. We assume the coordinates of the user's head lie at the midpoint between the left and right points, and we infer the following formulas for the person's location in the format of (X coordinate, Y coordinate), where P represents the person:

P.X = (P.left.X - P.right.X) / 2 + P.right.X  (5-6)
P.Y = (P.right.Y - P.left.Y) / 2 + P.left.Y  (5-7)


Figure 5-6 shows how to analyze the orientation when the user is facing within the range of 270 to 360 degrees, which satisfies the condition that left.X > right.X and left.Y > right.Y. There are 8 different cases in the orientation analysis of this system. In this case, the angle a between the shoulder line and the Y-axis is

a = atan((P.left.X - P.right.X) / (P.left.Y - P.right.Y))

so we get

orientation = 360 - atan((P.left.X - P.right.X) / (P.left.Y - P.right.Y))

Figure 5-6 Orientation and Position Analysis, P.left.X > P.right.X and P.left.Y > P.right.Y

5.2 OSGI Location Service

5.2.1 OSGI

OSGI stands for the Open Service Gateway Initiative, an open specification for the delivery of multiple services over wide area networks to local networks and devices. The specification is a Java-based application layer framework, as we can see from Figure 5-7, that gives service providers, network operators, device makers and appliance manufacturers vendor-neutral application and device layer APIs and functions. This strategy enables virtually all emerging home networking platforms, protocols and services to seamlessly interoperate with back-end services using existing residential telephone, cable TV or electrical wiring.
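The following sketch gathers equations (5-6) and (5-7) together with the 270-360 degree orientation case of Figure 5-6. The class and method names are illustrative stand-ins for Tag.java, and the remaining seven orientation cases follow the same pattern with different sign conditions.

public class PersonSketch {
    double x, y, orientation;   // orientation in degrees

    static PersonSketch fromBeacons(double lx, double ly, double rx, double ry) {
        PersonSketch p = new PersonSketch();
        p.x = (lx - rx) / 2 + rx;   // equation (5-6)
        p.y = (ry - ly) / 2 + ly;   // equation (5-7)
        if (lx > rx && ly > ry) {   // the 270-360 degree case
            double a = Math.toDegrees(Math.atan((lx - rx) / (ly - ry)));
            p.orientation = 360 - a;
        }
        // ... seven more cases for the other sign combinations
        return p;
    }
}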


The layered architecture of OSGI (Figure 5-7) [23] has three components: the framework, bundles and services. The framework is the core of the OSGI specification, providing a common environment for executing services from different vendors. It provides a general-purpose, secure, managed Java framework that supports the deployment of extensible and downloadable service applications known as bundles.

Figure 5-7 OSGI Architecture

In the OSGI environment, bundles are the only entities for deploying Java-based applications. A bundle comprises Java classes and other resources, which together can provide functions to end users and provide components, called services, to other bundles. A bundle is deployed as a Java Archive (JAR) file; JAR files store applications and their resources in a standard ZIP-based Java file format. Bundles can be installed to provide services for the user and removed once the service is no longer required. Installed bundles can register a number of services that can be shared with other bundles under strict control of the framework. The framework can manage the installation and update of bundles in an OSGI environment in a dynamic and scalable fashion.


A JAR file of a bundle:

* contains the resources to implement zero or more services; these resources may be class files for the Java programming language or other data files;
* contains a manifest file describing the contents of the JAR file and providing information about the bundle;
* states dependencies on other resources, such as Java packages, that must be available to the bundle before its operation;
* designates a special class in the bundle to act as the Bundle Activator; the framework must instantiate this class and invoke its start and stop methods to start or stop the bundle.

A bundle may have the states shown in the flow chart of Figure 5-8.

OSGI emphasizes an open environment for all different services; everything is a service in this model. Applications in OSGI are collections of services, even though the services may come from different providers. Such OSGI services are defined by their service interfaces and implemented as service objects. The purpose of the service interface is to specify the semantics and behavior of a service; the interface also makes the implementation transparent to the user. There can be many service implementations for a single service interface, and this is one of the many beauties of OSGI. For example, a single standardized service interface for controlling a PC camera can be implemented by many camera vendors, even though there are many different camera brands and models. A service object is a Java object of a class that implements the service interface. A service object is owned by and runs within a bundle; this bundle must register the service with the framework service registry so that the service's functionality is available to other bundles under control of the framework.
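As a minimal illustration of the Bundle Activator role described above, the sketch below registers a location service with the framework service registry when the bundle starts and removes it when the bundle stops. The LocationService interface and the stub implementation are assumed names, not the thesis's actual classes.

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

interface LocationService {
    double[] getLocation();   // x, y, orientation
}

public class LocationActivator implements BundleActivator {
    private ServiceRegistration registration;

    public void start(BundleContext context) {
        // Register the service object under the service interface's name.
        registration = context.registerService(
                LocationService.class.getName(),
                new LocationService() {
                    public double[] getLocation() { return new double[] { 0, 0, 0 }; } // stub
                },
                null);
    }

    public void stop(BundleContext context) {
        registration.unregister();   // the service disappears with the bundle
    }
}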


Figure 5-8 Bundle State Diagram (automatic and explicit transitions)

5.2.2 Location Server with Indoor Location Service Bundle

The custom indoor location service software communicates with the Hexamite hardware to get the distance string, parses the string to obtain the shortest distances between the beacons and pilots, and then performs the geometric analysis that provides the user's location coordinates and orientation. These software files are packaged into an OSGI bundle that provides a generic location information service, so that multiple users can share the information simultaneously by registering with the service. Figure 5-9 illustrates the scheme of this location service.


Figure 5-9 Location Service Scheme (Hexamite bundle with communication module Location.java and processing modules Person.java, Tag.java and House.java; EventBroker bundle; LocationServer bundle serving the DrishtiIndoor proxy and other services that need location information)

The way the location services are bundled is based on work done by Sree Kuchibhotla [24]. Three bundles work together to provide the user with location information, which can be shared by other bundle applications simultaneously when there are concurrent requests for location. The indoor location service bundle contains all the custom location service software and returns a Person object, which carries the coordinates and orientation, twice every second. This service is activated by the Activator, which interacts with the OSGI Framework once the Framework user starts the bundle. The Person objects are thrown as events to the EventBroker bundle, which has a thread listening for these events continuously. The EventBroker bundle acts as a middleman, receiving each event and throwing it out to its receivers; any bundle that has registered with the EventBroker can be a receiver and grab the event. This is the virtue of the EventBroker: it can serve many requests at the same time.


In our case, the LocationServer bundle gets the events thrown from the EventBroker and listens for client requests. Once a client communicates with the LocationServer, the client can get the Person object; this communication is implemented using TCP, as sketched below. Because of the openness of OSGI, we can run many applications simultaneously on the framework of the same computer. For example, a blind person using the indoor location system to guide him while walking in the smart house may ask the server to turn on the TV or radio, which is supported by another application bundle running concurrently on the server; he may ask Drishti for directions to the microwave and then ask another service to open the microwave door and cook for him! You can use your imagination and add as many applications as you wish.
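A hedged sketch of that TCP exchange follows: the Drishti proxy's HexClient connects to the LocationServer bundle and reads back the Person fields. The host, port, request byte and wire format are all assumptions made for illustration.

import java.io.DataInputStream;
import java.net.Socket;

public class HexClientSketch {
    // Returns {x, y, orientation} as read from the LocationServer.
    static double[] getLocation(String host, int port) throws Exception {
        try (Socket s = new Socket(host, port);
             DataInputStream in = new DataInputStream(s.getInputStream())) {
            s.getOutputStream().write('L');          // assumed "get location" request byte
            return new double[] { in.readDouble(),   // x
                                  in.readDouble(),   // y
                                  in.readDouble() }; // orientation
        }
    }
}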


CHAPTER 6
SUMMARY AND FUTURE WORK

The previous chapters explain in detail the motivation, architecture and implementation of the integrated navigation system for blind or visually impaired people, along with the related technologies used by the system.

6.1 Achievement and Contribution

This thesis is an exciting attempt to combine indoor and outdoor navigation into one complete system. It uses COTS software and hardware extensively to give blind people more convenience in traveling and living, while keeping their hands free as they do so. The equipment they need to wear is a small brick-sized wearable computer, a GPS receiver, a microphone-speaker headset and two ultrasound transceivers that are smaller than a credit card. The indoor component integrates seamlessly with the Drishti outdoor system by adding just the two ultrasound transceivers. The user can switch from outdoor mode to indoor mode simply by saying "indoor" or "room" upon entering an indoor environment, and can switch back to outdoor mode by saying "navigate" once outside. In indoor mode, because the system is bundled as an OSGI service, the location information can be shared simultaneously by other services running on the same server. For example, the user may be monitored by a care-giving center in case of emergency, or he may use other services, such as having a door open automatically once he faces it within some distance. Other applications can be added to the OSGI environment to give the user more convenience without affecting the existing services.


6.2 Future Work

The working range of this system depends on the coverage of the wireless network. To give the user a larger traveling range without the expense of installing a wireless network, we plan to replace the wearable computer with a mobile phone for communicating with the server; this will also ease the burden on the user. The system currently assumes that the user is standing and estimates the vertical position of the beacons as an average person's height. To make the system more realistic, it could be augmented to 3D measurement by adding more pilots at different heights in the house and placing more beacons on different parts of the user's body, capturing the user's real posture, whether standing, sitting, bending or lying down, so that falls can be detected.


REFERENCES

[1] Virtanen, A., 2002. "Navigation and Guidance System for the Blind." Available from URL: http://www.vtt.fi/aut/results/navi/navigationandguidancefortheblind.ppt Site last visited December 2002.

[2] Moore, S. E., "Drishti: An Integrated Navigation System for the Visually Impaired and Disabled." Master's thesis, University of Florida.

[3] Madry, S.; Colvard, C.; Lathrop, R., 2002. "A New Paradigm-The GIS 'Layer Cake.'" Available from URL: http://crssa.rutgers.edu/courses/geomatics_info/sld004.htm Site last visited December 2002.

[4] Environmental Systems Research Institute, 2002. "ESRI, GIS and Mapping Software." Available from URL: http://www.esri.com Site last visited December 2002.

[5] Brain, M.; Harris, T., 2002. "How GPS Receivers Work." Available from URL: http://www.howstuffworks.com Site last visited December 2002.

[6] Environmental Systems Research Institute, 2002. "ArcSDE: The Gateway for GIS Data in a DBMS." Available from URL: http://www.esri.com/library/brochures/pdfs/sdebro.pdf Site last visited December 2002.

[7] Herian/Hexamite Cooperative, 2002. "Hexamite Positioning Devices Utilize Ultrasound for High Resolution High Repeatability Multidimensional Multipoint Guidance and Tracking." Available from URL: http://www.hexamite.com Site last visited December 2002.

[8] Mann, S., 1998. "Wearable Computing FAQ." Available from URL: http://www.Wearcam.org/wearcompfaq.html Site last visited December 2002.

[9] Borenstein, J.; Koren, Y., "Obstacle Avoidance with Ultrasonic Sensors." IEEE Journal of Robotics and Automation, Volume 4, Issue 2, April 1988, Page(s): 213-218.


[10] Ram, S.; Sharf, J., "The People Sensor: A Mobility Aid for the Visually Impaired." Second International Symposium on Wearable Computers, Digest of Papers, 1998, Page(s): 166.

[11] Zelek, J., 2002. "The E. (Ben) & Mary Hochhausen Fund for Research in Adaptive Technology for Blind and Visually Impaired Persons." Available from URL: http://www.eos.uoguelph.ca/webfiles/zelek/cnib2.pdf Site last visited December 2002.

[12] Hu, H.; Probert, P., "The Oxford Project and the GEC AGV," a chapter in Advanced Guided Vehicles: Aspects of the Oxford AGV Project, Page(s): 9-16, Editors S. Cameron and P. J. Probert, Oxford University (Oxford, UK): World Scientific Press, 1994.

[13] Golding, A. R.; Lesh, N., "Indoor Navigation Using a Diverse Set of Cheap, Wearable Sensors." Third International Symposium on Wearable Computers, Digest of Papers, 1999, Page(s): 29-36.

[14] Feiner, S.; MacIntyre, B.; Hollerer, T.; Webster, A., "A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment." First International Symposium on Wearable Computers, Digest of Papers, 1997, Page(s): 74.

[15] Mann, S., 1997. "Existential Technology of Synthetic Synesthesia for the Visually Challenged." Available from URL: http://eyetap.org/wearcomp/isea97/index.html Site last visited December 2002.

[16] Chen, T.; Shibasaki, R., "A Versatile AR Type 3D Mobile GIS Based on Image Navigation Technology." IEEE International Conference on Systems, Man and Cybernetics, 1999, Page(s): 1070-1075.

[17] Makino, H.; Ishii, I.; Nakashizuka, "Development of Navigation System for the Blind Using GPS and Mobile Phone Combination." Engineering in Medicine and Biology Society, 1996: Bridging Disciplines for Biomedicine, 18th Annual International Conference of the IEEE, Volume 2, 1997, Page(s): 506.

[18] Loomis, J. M.; Golledge, R. G.; Klatzky, R. L.; Speigle, J. M.; Tietz, J., "Personal Guidance System for the Visually Impaired." Proc. 1st Ann. ACM/SIGCAPH Conf. on Assistive Tech., Page(s): 85-91, 1994.

[19] Walsh, D.; Capaccio, S.; Lowe, D.; Daly, P.; Shardlow, P.; Johnston, G., "Real Time Differential GPS and GLONASS Vehicle Positioning in Urban Areas." IEEE Conference on Intelligent Transportation Systems, 1998, Page(s): 514-519.


[20] Kridner, C., "A Personal Guidance System for the Visually Disabled Population: The Personal Indoor Navigation System (PINS)." Available from URL: http://vision.psych.umn.edu/www/people/legge/5051/PGS2.pdf Site last visited December 2002.

[21] Ertan, S.; Lee, C.; Willets, A.; Tan, H.; Pentland, A., "A Wearable Haptic Navigation Guidance System." Second International Symposium on Wearable Computers, Digest of Papers, 1998, Page(s): 164-165.

[22] Helal, A.; Moore, S. E.; Ramachandran, B., "Drishti: An Integrated Navigation System for Visually Impaired and Disabled." Fifth International Symposium on Wearable Computers, Proceedings, 2001, Page(s): 149-156.


BIOGRAPHICAL SKETCH

I received my Master of Science degree in chemistry in China in 1996 and worked for two years at the National Research Center of Certified Reference Materials in Beijing, China. I then found myself deeply attracted to computer science, one of the most promising fields in the world. In 1999 I came to America, and in 2000 I began my studies in the Department of Computer and Information Science and Engineering at the University of Florida, where I immersed myself in this wonderful body of knowledge. During the Christmas break of 2001, I went to see Dr. Helal and began work on the Drishti project. I truly enjoyed the days and nights I spent in the Harris and ICTA labs, because the more effort I put into the work, the more knowledge and experience I gained. With Drishti as my stepping-stone, I aim to reach greater heights in my coming career.


Permanent Link: http://ufdc.ufl.edu/UFE0000774/00001

Material Information

Title: Drishti, Integrated Indoor/Outdoor Navigation System and Service
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0000774:00001

Permanent Link: http://ufdc.ufl.edu/UFE0000774/00001

Material Information

Title: Drishti, Integrated Indoor/Outdoor Navigation System and Service
Physical Description: Mixed Material
Copyright Date: 2008

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0000774:00001


This item has the following downloads:


Full Text











DRISHTI: AN INTEGRATED INDOOR/OUTDOOR NAVIGATION SYSTEM AND
SERVICE











By

YINGCHUN (LISA) RAN


A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA


2003

























Copyright 2003

by

Yingchun (Lisa) Ran




























To my dear parents for their encouragement. To my dear husband Jeffery for all of his
deep love, support and consideration. To my lovely baby son Samuel, who has made the
great sacrifice of letting his mother stay in America to finish this research. His smile has
made this thesis possible.















ACKNOWLEDGMENTS

I would like to thank Dr. Abdelsalam Helal sincerely for his great work, guidance and

encouragement. I would like to express my deep thanks to Steve Moore for his wonderful

work on the Drishti outdoor version and his generous help on the spatial database, code

debugging and voice communication. I also thank Bryon Winkler for kindly answering

my numerous questions about the indoor location system.
















TABLE OF CONTENTS
Page

A C K N O W L E D G M E N T S ................................................................................................ iv

LIST OF FIGURES .................................... ... .. ... ........ .......... .... vii

A B S T R A C T ............... ................................................................................. ......... ..... ix

CHAPTERS

1 IN T R O D U C T IO N ............ .............................. .......................... .. ............. 1

2 REVIEW OF RELATED TECHNOLOGIES ....................................... .............. 3

2.1 Geographic Information System (GIS).................................................................. 3
2.1.1 Spatial D ata ................................................... .......................... ....... 3
2 .1.2 Sp atial D ata M odels........................................... .......................................... 4
2.1.3 Attribute D ata ........................................... ................. .. ......... 5
2.1.4 A rcV iew .............................................................. ... ............. 5
2.2 Global Positioning System (GPS)................................................... .................. 6
2 .3 A rcSD E ...................................... ................................. .................... 8
2 .4 A rcSD E Java A P I ......................... ..................................................... ................ 9
2.5 Hexamite Ultrasound Local Positioning System.................................................... 9
2.6 Voice Recognition and Speech Synthesis................................................... 11
2 .7 O S G I ............ .............................................................................. 12
2.8 W earable C om putting .................................................. ................................ 13
2.9 Wireless Communication...................... ....... ............................ 14

3 OVERVIEW OF THE DRISHTI OUTDOOR NAVIGATION VERSION ................. 15

3.1 R eview of R elated W ork ............................................................... .................... 15
3.1.1 Obstacles And H azards D etecting ......................................... ..... ......... 15
3.1.2 Location And Orientation ................ ................................... .. ... ........... 17
3.2 Outdoor Version of Drishti Navigation System.......................................... 22
3.2.1 System D esign ............................ ..... .... .. ..... .............. 22
3.2.2 COTS Hardware And Software .................... ........................ ........... 27

4 THE INTEGRATED INDOOR/OUTDOOR DRISHTI ........................................ 29

4.1 System A architecture ...... .................................................... .................... 29
4.2 Interactions of C om ponents ......... ................. ................................ .............. 32









4.2.1 Client............................................................ 32
4.2.2 C lientServer P roxy ......... ................. ................. .................... .............. 36

5 LO CA TION SERV ER .................... ............................................... ........................... 41

5.1 Hexamite Location System ........................... ........................ 41
5.1.1 H ardw are C om ponents................................................ ........................... 41
5.1.2 H ardw are Configuration ................ .......................................................... 43
5.1.3 D instance String................... ............. .................... .. .............. .. ............ 46
5.2 O SG I L location Service ............. ...................................................... .............. 49
5.2.1 O SG I .......................................... ... ...... ... ....... .............. 49
5.2.2 Location Server with Indoor Location Service Bundle .................................. 52

6 SUMMARY AND FUTURE WORK ........................................ ........................ 55

6.1 A chievem ent and contribution ............................................................. .............. 55
6 .2 F future W ork ........... .... ................................................. ........................... 56

R E FE R E N C E S .... ............................................................................. .......... ...... 57

B IO G R A PH ICA L SK ETCH ......... ................. ........................................................... 60
















LIST OF FIGURES

Figure Page

2-1 A n Exam ple of Layers in G IS.............. ............................... .................. ............... 4

2-2 Different Layers in One View in ArcView ................................. ........................ 7

2-3 How GPS Works ................................... ...... ... ................. .7

2-4 A rcSD E A rchitecture......... .... ...................... ................ ............................ .9

2-5 Six-Point Hexamite Local Positioning System........................................................11

3-1 Structure of Navigation System Using Wearable Sensors.................. .............. 18

3-2 Location Guidance System, Hideo Makino etc. ................................... ............... 20

3-3 Client/Proxy/Server Architecture of Drishti......... ....... ................ .............23

3-4 M obile Client Com ponents Interaction ........................................ ........ ............... 24

3-5 Sample Voice Prompt of a Route .............................................................................25

3-6 User Browses List of Available Destinations and Requests a Route ...........................27

4-1 W earable M obile C lient......... ......... ........ .......... .......................... ............... 29

4-2 Integrated Indoor/Outdoor/Client/Proxy/Location Server/Architecture.......................30

4-3 Clients Three Components: Vocal Interface, DGPS Receiver and Communicator ........32

4-4 C lient M manager A architecture ........................................ ............................................36

4-5 W ork Process of D rishtindoor............................ ................................. ............... 38

4-6 Example of Geometric Calculation of Orientation..................................39

5-1 H exam ite Location System D devices ..................................................... .... ........... 42

5-2 Hexamite Ultrasound Location System Coverage of the Smart House.........................43

5-3 Settings.txt for Hardware Configuration ........................................ ...... ............... 44









5-4 Location Calculation Trilateral ........................................................ ...............47

5-5 Location Calculation Example ............................................... ............................. 48

5-6 Orientation and Position Analysis P.left.X>P.right.X and P.left.Y>P.right.Y................49

5 -7 O S G I A rch itectu re ...........................................................................................................5 0

5-8 State D iagram B undle............. ................................................ ...... ...... ........... ... ..52

5-9 L location Service Schem e ........................................................................ .................. 53











































viii















Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

DRISHTI: AN INTEGRATED INDOOR/OUTDOOR NAVIGATION SYSTEM AND
SERVICE

By

Yingchun (Lisa) Ran

May 2003


Chair: Abdelsalam (Sumi) Helal
Major Department: Computer and Information Science and Engineering

Drishti is an integrated indoor/outdoor navigation system for visually-impaired people.

It uses a precise position measurement system, a wireless connection, a wearable

computer, and a vocal communication interface to guide users and help them travel

independently and safely. In the outdoors, Drishti uses DGPS as its location system to

keep the user as close as possible to the central line of sidewalks; it provides the user with

an optimal route by means of its dynamic routing and rerouting ability. The user can

switch the system from an outdoor to an indoor environment with a simple vocal

command. An ultrasound location system called "Hexamite" is used to provide very

precise indoor location measurements. The user can ask information about the room's

layout and the positioning of any furnishings. The user's location is then compared to the

spatial database of the "smart house" and the relationship between the user and the indoor

facilities is computed. Drishti then gives travel prompts about possible obstacles to the









user to help him/her avoid injury. Drishti also provides the user with step-by-step walking

guidance.

The indoor service of Drishti is bundled under the OSGI framework to make it

compatible with other services offered by smart houses, such as opening the door for a

visitor, checking the weather using a phone, etc.














CHAPTER 1
INTRODUCTION

Statistics [1] indicate that there are approximately 10 to 11 million blind or visually

impaired people in North America, and this number is growing at an alarming rate. As

many of these people have difficulty knowing where they are or where they are going,

frequently feeling totally disorientated or even isolated, navigational guidance is very

important for them. Navigation involves updating one's position and orientation while he

or she is traveling an intended route, and in case the person becomes lost, reorienting and

reestablishing a route to the destination. Guiding people is about giving them more

information that usually includes obstacle prompting.

Visually impaired people not only have a very limited reachable world but also depend

on repetitive and predefined routes with minimum obstacles. At times these routes may

be subject to change: a sidewalk may be blocked for roadwork, a fallen branch or

temporary puddle of water after a heavy rain on the route may be dangerous to the people

who could not see it. A guide dog or long cane may help detect the problem, but blind

people need more information to find detours or rearrange routes.

This thesis is based on the outdoor version of Drishti navigation system done by Steve

Edwin Moore [2]. The outdoor version of Drishti uses DGPS to locate the user in an

outdoor environment, answers the user's various requests and gives information about

routing and rerouting dynamically according to changes in the environment. In this thesis

we extend the Drishti outdoor version to support indoor navigation. In an indoor

environment, traveling is even more difficult because the space is relatively small and









there are a lot of narrow hallways, stairs, doors and furniture, so visually impaired people

may face closer obstacles. They may very likely stumble over obstacles. If they are new

to the environment, it is very dangerous for them to walk alone. This system tells the user

the layout of the indoor facility, and gives him/her a big picture of what the environment

is like. The user may also get distance and navigation information between destinations.

On the way he/she can ask the obstacle prompt to guarantee travel safety. The system can

also communicate with the user and answer different requests.

Because GPS is not available in indoor situations, and because the requirements of

measurement error change, the Drishti system switches to a different positioning service

called Hexamite for indoor use and prompts the user with the indoor room layout using

the example of Smart House. Since the indoor space is smaller and more crowded than

the outdoors, a high precision measurement scale is provided.















CHAPTER 2
REVIEW OF RELATED TECHNOLOGIES

Many mature and commercial technologies are used in this research to provide

comprehensive (indoor/outdoor) navigational guidance. In the following, I briefly

describe each of these technologies.

2.1 Geographic Information System (GIS)

GIS is a complex computer system that incorporates technologies from a wide range

of disciplines, including, but not limited to remote sensing, cartography, surveying,

geodesy, photogrammetry, geography and computing. It combines layers of information

about a place to provide a better understanding of it.

With GIS, you may combine many layers of information according to your own

purpose. The real power of a GIS is its ability to integrate various data layers and perform

data analyses between the data layers.

Figure 2-1 is an example of data layers used to describe a piece of land, including

information about hydrology, soils, roads, elevation, land use, etc, each piece of

information being one layer.

2.1.1 Spatial Data

Once the layers are compiled, we can analyze the information they represent. The

information on the layer map is spatial data. Spatial data contain the coordinates and

identifying information for various map features. There are mainly three kinds of

features: points, lines and polygons (areas). Buildings can be represented as polygons;











road, railways and rivers are all lines, well and city in a marketing layer may be

considered points.


















Figure 2-1 An Example of Layers in GIS


2.1.2 Spatial Data Models

There are two kinds of spatial data models: raster and vector. The raster format uses an

array of grid cells or pixels. Each grid cell is referenced by a row and column number and

contains a number representing the type or value of the attribute being mapped. Each grid

cell represents an area on the surface of the earth and the average value of whatever

attribute is being considered for that particular place. Real world features are assumed to

be present or absent from any given square. The smaller the square size is, the more

accurate the representation of the real world feature is. There are no points, lines and

polygons.

In Vector GIS, we represent real world features abstractly as mathematical vectors

located in a Cartesian (x, y, z) coordinate space. Vector technology uses a series of lines










to define the boundary of the object of interest. In this thesis we are using vector model.

Points, lines, polygons are vector based GIS.










Point Line P. 1'' ni


2.1.3 Attribute Data

Attribute data is another type of GIS data that is not on the layer map but that can be

associated to the map through links to the spatial data. Say that a point representing a city

in a marketing layer is spatial data, then the amount of coke sold in that city and the

population of that city are attribute data.

2.1.4 ArcView

ArcView is a powerful tool made by Environmental Systems Research Institute

(ESRI) for the management, display, query, and analysis of spatial information. With the

knowledge of spatial data, attribute data and GIS layers, we can easily build views,

tables, charts, layouts, scripts and wrap them together into a project to represent the

relationship between spatial data and attribute data. Following is the basic knowledge of

ArcView.

* Project: A project is a file in which ArcView stores the user's work. All related work
can be wrapped in a single project, including tables, charts, spatial views of your data,
map layouts, etc. When that project file is opened again, all its wrapped component
parts will be ready to use again. Each project has a window.

* Project Window: The project window is the smaller window on the left of the initial
ArcView window. The initial "untitled" name of the project will be changed to the
name the user defined with an .apr extension. It lists all the components of the project












in the order of views, tables, charts, layouts and scripts. People can use this window
to add new components to a project or to open existing ones.

* View: A view is the interactive map that is used to display, query, and analyze data in
ArcView. Several map layers--called Themes--are normally displayed in a single
view. You can have more than one view in a project.

* Theme: "Theme" is an acronym used for a map layer in ArcView containing both
spatial and attribute data. A Theme is a file containing graphic information required
to draw a set of geographic features together with information about those features.
Themes are listed on the left side of the view window in the Table of Contents along
with the legends that represent them on the map.

* Table: A Table is a data file that contains rows of information about items in a
particular geographic category such as hotels, cities, streets, counties, countries, etc.,
with each row representing a different named item. Tables have numerous columns,
with each column representing a particular attribute. Tables are the main components
of database stored in ArcView.

Figure 2-2 is an example of creating different layers (boundary, creeks, etc) in one

view.

In this work, ArcView is used as a geographic tool to create the indoor navigation

database.

2.2 Global Positioning System (GPS)

GPS is a worldwide radio-navigation system formed from a constellation of 24

satellites and their ground stations. The location of a feature on the surface of the earth

and its spatial relationship to other features around it are often determined by use of a

GPS. (The Global Navigation Satellite System (GLONASS) deployed by the Russian

Federations has much in common with GPS in terms of the satellite constellation, orbits

and signal structure.) Figure 2-3 [5] illustrates how GPS works for the GPS receiver to


find out its location.












File Edit Viev; Iheme Graphic; Window Help


8so. Bourdary shp

Vj Creeks.shp
Al
vj Cenlineshp


vj Cn 9j' ip
Minor
7/'Rural


.'*' ;' S, -I"






,- -
:. .


Figure 2-2 Different Layers in One View in ArcView












HNolzon
D 2M0 How St War Your Location


Figure 2-3 How GPS Works


The GPS receiver uses the geometric principle, trilateration, that allows one to find a

location if its distance from other already-known locations is known. The receiver

receives radio signal from four (or more) GPS satellites, calculates the distance from the

satellite based on the time the signal takes to arrive at the receiver and then decides its

exact location and altitude on the earth.









In this project, GPS information is used to locate the Drishti user when he or she is

outside and when GPS information is available.

2.3 ArcSDE

ArcSDE [6] is the GIS gateway that helps manage spatial data in a DBMS and makes

the data available to many kinds of applications, providing data and maps across the

Internet. ArcSDE allows you to manage spatial data in one of four commercial databases

(IBM DB29, Informix, Microsoft SQL ServerTM, and Oracle) and to serve

ESRI's file-based data with ArcSDE for Coverages. ArcSDE provides data to the ArcGIS

Desktop products (ArcView, ArcEditorTM, and ArcInfoTM) and through ArcIMS, and

it is a key component in managing a multi-user spatial database.

ArcSDE supports spatial and non-spatial queries from clients. It can interact with a

relational database management system (RDBMS) server for data storage and retrieval. It

can also perform GIS operations on data. Figure 2-4 is an example of ArcSDE

architecture.

In this research, we load the database made by ArcView onto ArcSDE and send

queries from the Drishti client manager to ArcSDE. ArcSDE works as a gateway for

Drishti user and oracle8 DBMS. The spatial database of the "smart house" can be laid on

top of a map of the University of Florida campus, so users will have a global idea of his

location even when they are inside a building.









2.4 ArcSDE Java API

Com.esri.sde.client is the java appication-programming interface to build ArcSDE




ArcSDE Architecture











Figure 2-4 ArcSDE Architecture
client/server
solutions

Database
Solution




Figure 2-4 ArcSDE Architecture

database queries. It uses Streams to transfer data between a SDE server and client. Input

coordinates are gathered to build shapes and are compared with shapes fetched from the

SDE server. If both shapes overlap within 1 foot, they are considered "close". If

isContaining operation of two shapes returns true, the first shape is said to be within the

second shape. We use these operations to define the spatial location of the blind person

within the "smart house."

2.5 Hexamite Ultrasound Local Positioning System

The Hexamite ultrasound local positioning system is offered by an OEM company,

Hexamite, from Australia [7]. It harnesses ultrasound for high resolution high

repeatability positioning. The highest resolution can reach up to 0.3 mm. It consists of at

least two Hexamite positioning devices in which one device knows the distance to

another. The device that knows the distance to the other is called a pilot and the other










kind of device is called a beacon. The Hexamite ultrasound local positioning device may

consist of a limitless number of pilots and beacons to form a large system as desired by

the designer. The system is composed of two parts: custom software and location devices,

which include HE900M pilots, HE900T beacons and a RS485/RS232 converter.

The nominal value for speed of sound in air is 344m/s. Overall attenuation in air is due

to geometric spreading, conduction and shear viscosity losses, molecular relaxation,

boundaries, refraction by non-homogeneous atmosphere and diffraction by turbulence.

The speed of sound may alter depending on the attenuation of the air. Distance between a

pilot and a beacon is calculated by multiplying the speed of sound with half the time the

sonic wave takes to travel to and from the beacon.

Figure 2-5 illustrates a 6-point system that consists of 4 fixed pilots (1, 2, 3, 4) and 2 moving beacons (5, 6). The nature of the sonic wave sets the operating range and limits; most Hexamite local positioning systems use an ultrasound frequency of about 40 kHz, which limits the operating range to about 20 m per point. Customers can use a number of devices to increase the monitored space and range.

The Hexamite ultrasound local positioning system is a time-sharing system that

requires synchronization. This can be accomplished by connecting the pilots together or

by radio, light or sound. There are three ways of synchronization in this system: via

RS485 serial input, via I/O pin or by sound. In this example, pilots 1, 2, 3 and 4 are

connected together via RS485. One of the fixed pilots functions as a master that

synchronizes beacons 5 and 6 with the built in sonic synchronization feature.

The master initiates the timing or distance acquisition cycle of the whole system by

sending out synchronization information when the cycle begins. All the pilots in the









system transmit their distances to the two beacons one after another over the serial network during the cycle, and the last pilot sends information to the master pilot to trigger the next cycle.

We adopt this 6-point Hexamite local positioning system as the foundation of the

indoor navigation system, which is combined with the client manager, Drishti server and

smart room database to make up a comprehensive guidance system. The details of how

the Hexamite local positioning system works are illustrated in Chapter 5.
Figure 2-5 Six-Point Hexamite Local Positioning System

2.6 Voice Recognition and Speech Synthesis

Because visually impaired people rely heavily on voice communication, voice recognition and speech synthesis play an important part in this navigation system. Although some researchers use haptic aids in their guidance systems, we think visually impaired people will be more confident while traveling if their hands are free. The only drawback is that the user may be less sensitive to environmental sounds while concentrating on communication with the system.

Voice or speech recognition is the ability of a machine or program to receive and

interpret dictation, or to understand and carry out spoken commands. Using analog-to-

digital conversion, the user's voice is captured by the microphone and converted into

digital signals on a sound card. For a computer to decipher the signals, it must have a digital database or vocabulary of words (more precisely, phonemes) and a speedy means of comparing this data with the signals. A comparator compares these stored patterns with the output signal of the analog-to-digital converter. The words that the comparator tries to match come from the grammar defined by the system designer.

Speech synthesis is the computer-generated simulation of human speech. It is used to

translate written information into aural information.

The javax.speech package (javax.speech.synthesis and javax.speech.recognition) defines an abstract software representation of a speech engine that deals with either speech input or speech output. The javax.speech.synthesis package can easily convert plain text to simulated human speech.

2.7 OSGI

OSGI (Open Service Gateway Initiative) is an industry plan for a standard way to

connect different devices. Its specification is a Java-based application layer framework

that focuses exclusively on providing an open application layer and gateway interface for

Services Gateways. Users are able to change from one monitoring service to another

without having to install a new system of wires and devices or replace any of the

networking infrastructures.











The smart house uses many devices, such as automatic lamps, radios, doors, caregiver monitoring systems, and alarm systems. This indoor location system is bundled as a single service, so the administrator of a smart home can easily switch back and forth among different services, or run several at the same time to enhance the function of each individual bundle.

2.8 Wearable Computing

Wearable computing [8] facilitates a new form of human-computer interaction based

on a small body-worn computer system that is always on and always ready and

accessible.

There are five major characteristics of wearable computers:

* Portable while operational: The most distinguishing feature of wearable computing is that it can be operated while the user is moving.

* Hands-free use: This feature, along with portability, is especially important for visually impaired people, whose hands may already be occupied by a long cane or a guide dog.

* Sensors: A wearable computer can be augmented with different services and sensors
like wireless communications, GPS, ultrasound, infrared etc. We use GPS for outdoor
location and Hexamite ultrasound system for indoor location.

* "Attention-getting": A wearable computer should be able to convey information to its
user even when it is not actively being used.

* Always on: A wearable computer is always on and working, sensing and acting.

These five distinguishing features account for several of the advantages of this project.

In this thesis a Xybernaut wearable computer is used. It can be worn on a belt or integrated into a vest; it can also be carried directly on the body. Combined with a headset or a flat panel, it frees its user to work with his/her hands.








2.9 Wireless Communication

This project uses 802.11b wireless LAN networks that provide 11 Mbps of

bandwidth.














CHAPTER 3
OVERVIEW OF THE DRISHTI OUTDOOR NAVIGATION VERSION

Blind and visually impaired people are at a disadvantage when they travel because they cannot get enough information about location, orientation, traffic and obstacles along the way, things that can easily be seen by people without visual disabilities. They depend on repeatable, regular routes, and their living environment is limited because of their disability. Before technical support existed, they had to rely on guide dogs and long canes when they traveled.

The goal of this navigation system is to allow visually impaired people to travel

through familiar and unfamiliar environments independently. The system usually consists

of three parts: sensing the immediate environment for obstacles and hazards, providing

information about location and orientation during travel and providing optimal routes

towards the desired destination.

3.1 Review of Related Work

3.1.1 Obstacles And Hazards Detecting

Guide dogs and long canes are the conventional methods of navigation. Many new

technologies have been used to help people travel with a greater degree of psychological

comfort and independence.

As early as 1988, Borenstein et al. [9] completed a communication system with ultrasonic

sensors for the blind. Their system is composed of three major subsystems: a mobile

carriage, a robot mounted on it, and a computerized post next to the disabled person's

bed. The robot uses two ultrasonic range finders mounted on the vehicle to detect











obstacles and provide information to detour them. There are other sensors like light-

detecting sensors, force sensors, a video camera and a speech recognition unit attached to

the system to augment the navigation function.

Sunita Ram and Jennie Sharf [10] designed the "People sensor," which uses

pyroelectric and ultrasound sensors to locate and differentiate between animate (human)

and inanimate (non-human) obstructions in the detection path. Thus, it reduces the possibility of embarrassment by helping the user avoid inadvertent cane contact with other pedestrians and objects, and avoid speaking to a person who is no longer within hearing range. The system also measures the distance between the user and obstacles.

John Zelek [11] is working on a technology, "the logical extension of the walking

cane," which provides visually impaired individuals with tactile feedback about their

immediate environment. Two small, webcam-sized video cameras wired to a portable

computer feed information into a special glove worn by the user. The glove has vibrating

buzzers sewn into each finger that send impulses to the user warning of terrain

fluctuations up to 30 feet ahead. Huosheng Hu and Penny Probert [12] did similar work

using ultrasound beams to find the nearest obstacle on the path. They went one step

further, using the frequency modulated ultrasound sensor to extract environment feature

information. The sensor consists of a separate transmitter and receiver. It transmits the

different signals as a continuous tone to the user through an earpiece and presents an

auditory map of the environment. Different ranges to the obstacles appear as different

pitches, and the loudness of the sound indicates how large a reflection occurred. The user

can distinguish between single and multiple objects and learn the sound of particular










feature shapes. The main disadvantage of this system is that it blocks the user's sense of hearing, which might be a vital source of information for visually impaired people.

3.1.2 Location And Orientation

There are many ways to determine the location and orientation of the user. These vary in the extent to which they require sensors or information from the external environment. At one extreme, all kinds of sensors are used to detect the user's current state, as A. R. Golding and N. Lesh did [13]; at the other extreme, no sensor is used, but a camera records images from the environment which are compared with 3D image models stored in a computer, as S. Feiner et al. did [14]. In between are methods using various local and global positioning systems, in which infrared or ultrasound transmitters, GPS or its Russian equivalent (GLONASS) are used to determine the current location and orientation.

The most sensor-intensive system is being built by Andrew Golding et al. They perform this context-aware task by using a set of cheap, wearable sensors that

include a 3D accelerometer, a 3D magnetometer, a fluorescent light detector and a

temperature sensor. The sensors are attached to a utility belt. The accelerometer detects

the user's acceleration in three dimensions, while the magnetometer measures the

strength and direction of a magnetic field; the fluorescent light detector extracts the 60Hz

component of the signal from a photodiode aimed at the ceiling to get the right direction,

and the temperature sensor gets the room temperature.

The data acquisition module continuously reads tuples of sensor readings at specific intervals and converts this information into canonical units. The raw sensor signals must be "cooked" to make them suitable for the machine-learning algorithm; in other words, the raw readings are augmented with computed features. Then, the data modeling takes a









model of the environment at training time, and the navigation model infers the user's

location at run time. Figure 3-1 is the structure of this navigation system.


[Figure: data flows from data acquisition through data cooking into data modeling at training time and into navigation at testing time; for each new sensor reading, the navigation loop multiplies in new sensor probabilities and redistributes probability mass according to the transition function.]

Figure 3-1 Structure of Navigation System Using Wearable Sensors

According to the experiment, the performance results are quite good for a simplified office environment. To apply this method to a more complex world and obtain good accuracy, better cooking algorithms would have to be designed and applied appropriately.

Another example about using sensors to detect the environment is VibraVest /

ThinkTank developed by Steve Mann [15]. This apparatus is a computational tank top

that is worn in close contact with the body, under ordinary clothing, to afford a synthetic

synesthesia of a new sensory modality, namely radar, which gets translated to "feel". The

chirplet transform, and other DSP methodology may detect targets accelerating toward

the wearer, helping him or her to avoid bumping into things, and similarly making the











wearer blind to targets that are moving away from him or her, solving the "information

overload" problem.

The other extreme is to use a head-mounted camera and employ 3D models.

Sequential images are first geo-referenced manually and registered in a database. Then

through the registered image the landmark lines are transferred on the other unregistered

images by image-to-image matching based on straight-line features to get the accurate

position and orientation for the real world images taken by the camera later [14,16]. If no

common landmark lines can be clearly seen in two neighboring images, relative

orientation is used to compute the new image's translation and rotation relative to its

predecessor by matching the neighboring images. An electronic compass and a gyroscope are necessary.

Recognizing a visual landmark in a cluttered environment is a very complex task because landmarks generally present very different appearances depending on the location they are seen from. The difficulty lies in determining the X and Y coordinates and the yaw angle of the camera. The principle of image sequence analysis

based on landmark lines is best illustrated in the "touring machine."[14] It includes single

image calibration, stereo images' relative orientation, sequential image analysis and

straight-line extraction and matching.

This method may put an extra requirement on the wearable computer, to work with

readily available peripherals, including high-performance 3D graphics cards. It also

requires a previously registered 3D image database and a graphics interface for users to

display the image and the contents of GIS database. All these requirements will surely










increase the cost and decrease the response speed, making the system less practical than it is supposed to be.

The in-between systems may be divided into two categories: one uses GPS information; the others use infrared or ultrasound transceivers.

The key problem in a navigation system is to determine where the user is located,

which then can be converted to the coordinates of a local GIS database to get the optimal

path. Most current systems use GPS for this task. GPS is a worldwide radio-navigation

system formed from a constellation of 24 satellites and their ground stations. It is

normally accurate to within a few meters.



Figure 3-2 Location Guidance System, Hideo Makino et al.


Loomis was one of the first to propose the idea of a navigation system for the blind

using GPS and acoustic information. In the 1990s, he built a navigational system for the

blind using DGPS with an FM correction data receiver for the stable determination of the

location of the traveler [18]. Hideo Makino et al. developed a system using GPS










information in two basic units in 1997. The first is the mobile unit for the blind traveler,

and the other is a base station for processing coordinates received from the traveler

through the mobile telephone and offering geographical information back to the traveler.

The error is 16 meters maximum. This system is illustrated in figure 3-2 [17], above.

The GPS signal is affected mainly by the deliberate degradation of the signals, called

selective availability (SA). To solve this problem, the obvious way is to increase the

number of satellites available. GLONASS (GLObal Navigation Satellite System) is the

Russian equivalent to GPS [19]. It has 19 operating satellites and is not affected by SA.

In stand-alone mode GLONASS is accurate to ±20 m. The stand-alone accuracy is about 10% better than GPS. Another solution is Differential GPS (DGPS). DGPS uses two receivers communicating by a radio data link. One base receiver has fixed and

known coordinates while the other is mobile. Errors in the signals arriving at the base

receiver are computed and are used to correct the signals at the mobile one. The above

location guidance system made by Hideo Makino uses the DGPS receiver.

Although the accuracy of GPS or the combination of DGPS and GLONASS can reach

cm level in some applications, this method does not work well for urban areas, where

GPS signals are interrupted by moving vehicles, or are blocked by tall buildings, highway

bridges or big trees. It will also not work indoors. There are many other ways to support

navigation in areas where GPS information is hard to get. Some use active badges,

beacon architectures or ceiling-mounted infrared transceiver system installed in the

building [20, 21]. This approach requires a great deal of effort and expense to modify

buildings. In [20], each transmitter and receiver used for position sensing is built into the

buildings like malls, auditoriums and conference halls. Each transmitter emits a unique










ID number to the environment. As the user passes a space with a built-in transmitter, the receiver picks up the IDs from the IR transmitters and sends the information to the wearable computer to compute the accurate position of the user. This system also has a sparse 4-by-4 stimulator array that delivers directional cues by means of the sensory saltation phenomenon. The flaw of this technique is that when the user passes the transmitter's location quickly, the IR signal from the transmitter may not be received properly, and the user might receive the wrong location information.

A variant proposal has three parts [21]: The first part is a mainframe computer that

contains the database of all the IDs of the transmitters and is on all the time. It can direct

the user about location and path. The second is a series of built-in transceivers and

sensors connected to the mainframe computer, which can send the information of the IR

signals to the mainframe. The third part is a headset that allows the user to communicate

with the mainframe computer. This headset can also emit infrared light that can be

detected by the transceivers, and the information can be sent to the mainframe via high-speed Ethernet to locate the user.

3.2 Outdoor Version of Drishti Navigation System

3.2.1 System Design

The outdoor version of Drishti, done by Steve Moore [2], is a navigation system for visually impaired people that uses DGPS to obtain the user's outdoor location.

The primary goal of this system is to augment visually impaired people's pedestrian experience with enough information that they feel comfortable and at ease walking outside, even in an unfamiliar environment.









Figure 3-3 is the client/proxy/server architecture of Drishti. The server and client

manager are developed using Java. The mobile computer serves as a client to the DGPS

server, which takes the user's voice input and obtains the accurate location. It also

communicates with the GIS database to get the optimal route, or to contact the police department, etc., if needed.


Figure 3-3 Client/Proxy/Server Architecture of Drishti

Because the size of the message is small, the system takes advantage of the User

Datagram Protocol (UDP) sockets' low overhead and avoids the delay the Transmission Control Protocol (TCP) incurs in dividing a message into packets and reassembling it at the other end. The client manager residing in the wearable computer gets geometric

information in longitude and latitude from the GPS receiver via serial port input and

passes these GPS coordinates to the Navigation Manager. There are two Navigation

Managers, one residing on the client side and the other on the server's side.

Figure 3-4 Mobile Client Components Interaction

If it is on the client's side, each GPS coordinate will be displayed in the Path Viewer, which is built

using Java2D classes and allows users to view their current location and environment

feature and route information; if it is on the server's side, the Sender sends the location

object to the server one by one. The server listener receives the GPS coordinates as one

current location object and places it in the Navigation Manager (NM) queue where it

waits for processing. The NM is a thread that will continually attempt to remove the

coordinate packet from the queue and process it. Because the GPS coordinate object is

updated every second, NM does not provide navigation prompts for each location. The

DGPS Listener has one method for marking the next location object to be spoken to the

user. The NM has a route object that contains the route the client is currently on. The NM

asks the Route for prompt information. Prompt information contains the user's current

location and direction along the route. This prompt is wrapped as one object and sent

back to the client, where it can be spoken out by the speech synthesizer to inform the










user. Figure 3-5 shows a sample voice prompt of one route.


Starting from Computer Science
Turn left on to Hub Walkway 2
Travel on Hub Walkway 2 for 79 feet
Turn left on to Stadium Road Walkway
Travel on Stadium Walkway for 225 feet
Turn left into stop #2
Starting from stop #2
Turn right on to Stadium Road Walkway
Travel on Stadium Rd. Walkway for 225 feet
Continue straight onto Hub Walkway
Travel on Hub Walkway for 81 feet
Turn left onto Black Hall Walkway
Travel on Black Hall Walkway for 111 feet
Turn right into Mathematics

Figure 3-5 Sample Voice Prompt of a Route

Given the current location and the destination, the system should return the optimal

route. But for the visually impaired person, the optimal route does not necessarily mean

the shortest route, because he/she may care more about safety. It is not uncommon for the shortest route to involve crossing roads, stairways or ramps that are not convenient for visually impaired people. One of the most important advantages of Drishti over other systems is that it can deliver landmark information along the blind person's path in real time, warn about potential hazards, generate routes preferable to the user, re-route the user if the current route is not available (for example, if the sidewalk is under construction or if the user changes his/her mind and heads somewhere else), and add notes to the system's GIS database for future processing. This GIS database is

made available to various campus departments like the University Police, the Physical

Plant and Special Events so that they can insert and remove dynamic obstacles. The client

has a ListBrowser that can provide the user with known building names. Then the











FromToListener is activated by saying "route". It will ask the user to say his or her

starting place at the prompt "from," and it expects the user to say his or her destination

after it prompts "to". Then it will send the request to the server, asking for routes. The

user can also request the addition of new information by saying "add place" or "add end

point." Drishti downloads these place names from the GIS database and writes them in

JSGF to a StringBuffer, which can be loaded as a grammar and activated. The new rule is added to the grammar, and Drishti can then understand the new places. Whenever the

user requests a route, Drishti presents the optimal route from the current location to the

destination according to the latest road information. Figure 3-6 displays the browse list

and communication between the user and Drishti.
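A minimal sketch of this dynamic-grammar step using the javax.speech.recognition API is shown below; the rule name "places" and the {place} tag are illustrative assumptions, not the thesis's actual identifiers.

import javax.speech.recognition.GrammarException;
import javax.speech.recognition.Recognizer;
import javax.speech.recognition.Rule;
import javax.speech.recognition.RuleGrammar;

public class PlaceGrammar {
    // Add a newly downloaded place name to the active grammar.
    static void addPlace(Recognizer rec, RuleGrammar grammar, String placeName)
            throws GrammarException {
        Rule rule = grammar.ruleForJSGF(placeName + " {place}");
        grammar.setRule("places", rule, true);  // (re)define the public rule
        grammar.setEnabled("places", true);     // activate it
        rec.commitChanges();                    // apply the grammar change
    }
}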

The GIS is adopted to provide a spatial database of the environment, to inform the

user if he/she is close to a building or needs to cross a speed bump or some stairs. It is

accessed via a wireless network. Drishti obtained the GIS dataset for the UF campus from

UF's physical plant division. The scale of the dataset is a critical factor in navigation

systems. The systematic error of the current GIS layers is 2 meters. Drishti accounts for

the error while determining the user's current location.

As stated above, Drishti performs very well for outside navigation, but because of the

attenuation of the GPS signal by buildings, trees or bridges, this system needs to be

augmented to make it work for urban areas and especially for indoor navigation. This is

the motivation for the integrated indoor/outdoor navigation system, in which the

Hexamite ultrasound location system is adopted for indoor location. We changed the way

Drishti communicated between user and system by adding another server to collect the











coordinates via the Hexamite system. I will discuss this combined new system in detail in

the following chapters.

User > "where can I go"
Drishti > "known buildings are", "Little", "Music", "Tigert"
User > "more"
Drishti > "Computer Science Engineering", "Matherly"
User > "departments"
Drishti > "known departments are" "Mathematics", Journalism"
User > "more"
Drishti > "Computer Science", "Forestry", "end of list"
User > "Stop"
Drishti > "ok"
User> "route"
Drishti > "from"
User > "Mathematics"
Drishti > "did you say Mathematics"
User > "yes"
Drishti > "to"
User > "Computer Science"
Drishti > "did you say Computer Science"
User > "yes"
Drishti > "ok, and away we go"


Figure 3-6 User Browses List of Available Destinations and Requests a Route

3.2.2 COTS Hardware And Software

Drishti uses some Commercial-Off-The-Shelf (COTS) hardware and software,

including Trimble PROXRS, a 12 channel integrated GPS/Beacon/Satellite receiver with

multi-path rejection technology, to receive GPS signals, and a Xybernaut wearable computer for client request processing. The prototype weighs approximately 8 lbs., which is considered acceptable by most blind and disabled persons. The wearable computer as

well as the GPS receiver is placed in the backpack. An integrated headset has an

earphone and microphone that are used to give vocal commands and to query and receive






route instructions, obstacle prompts and geometry information. The IBM ViaVoice interface is used as the vocal channel for communication between the user and the server.

Drishti also uses ESRI's COTS software, ArcView, to make spatial databases of

sidewalks and ArcSDE for database management and route storage. The Network

Analyst in ArcView can generate least-cost routes through a network.















CHAPTER 4
THE INTEGRATED INDOOR/OUTDOOR DRISHTI

4.1 System Architecture


Figure 4-1 Wearable Mobile Client

This thesis extends the outdoor version of Drishti to a complete navigation system by integrating an indoor positioning system. The Hexamite low-cost positioning device is used










to locate the user in indoor environments. The only things added on to the load of the user

are two ultrasound transceivers that are smaller than a credit card and can be tagged onto

the user's shoulder using Velcro. Figure 4-1 depicts a user with all the equipment on a

test run. The smart home is taken as an example to describe how this whole system works.

The architecture is displayed in Figure 4-2 below.




Figure 4-2 Integrated Indoor/Outdoor Client/Proxy/Location Server Architecture

The client communicates with the user via the headphone and microphone, enabled by

IBM COTS software ViaVoice. The user speaks into the microphone using the commands defined in the system grammar, making queries about his/her location and asking for route and obstacle prompts. If the user is outside, the client has two ways to get the

location: one is through the Navigation Manager on the server side, which processes the

coordinates and returns prompt information about the location; the other is through the











Navigation Manager local to the client, which gets the coordinates directly from the GPS

receiver and checks the current location status in the user's path, which was first put in

the client when the user asked for a route. Because this process is done locally it runs

fast.

The client Navigation Manager piles up all the requests in a queue and asks the Sender

to send these request objects to the server. The server has a ClientListener that listens for

requests from the client all the time. Once it receives a request, it will forward the request

to a different queue according to the request type. Then the ClientServer, a server proxy dealing with all the requests from the client, asks different task managers to finish

the requested tasks. If the request asks for a route, the server InfoReqManager gets the

starting point and final destination and asks the SDEClient to get the path and puts it in

the reply queue until the InfoSender sends it back to the client. The client has an InfoListener, a LocationListener and a fromtoListener, all listening to the server at all times. The InfoListener

gets the packet and asks VocalView to speak to the user.

If the user moves indoors, he/she can change the navigation mode to indoor by saying "Room" or "indoor" to Drishti. Then the user can ask for information about the room

and the layout of the furniture. If the request asks for the current location, the

ClientServer asks the InfoReqManager to get the request object from the queue and send it to the SdeClient, a client sitting in the server proxy that connects to the ArcSDE server.

SdeClient connects to both SDE server and Hexamite server. After the SdeClient gets the

coordinates from the Hexamite server, it sends a query to the SDE server and gets the

current location. The result is wrapped in a reply object and put in the reply queue. The








server has an InfoSender, which picks up the object from the queue and sends it back to

the client.

4.2 Interactions of Components

4.2.1 Client

The client is composed of three main parts: a vocal interface, a GPS receiver and a communicator. Each part contains many functions or files, as displayed in the following

figure.


Figure 4-3 Client's Three Components: Vocal Interface, DGPS Receiver and Communicator

The vocal interface exploits the IBM COTS software ViaVoice, which can understand what the user asks and can talk back to reply to the user's request. The vocal interface can be programmed using the javax.speech package. This package contains a recognizer, a synthesizer and a rule defined by the designer in a grammar. A Recognizer provides










access to speech recognition capabilities. The primary capabilities provided by a

recognizer are grammar management and result handling. A Grammar defines a set of

tokens (words) that may be spoken and the patterns in which those tokens may be spoken.

We are using RuleGrammar format. RuleGrammar interface describes a Grammar that

defines what users may say by a set of rules. The rules may be defined as rule objects that

represent the rule in a data structure or as defined in the Java Speech Grammar Format

(JSGF). The format of the rules we made is shown below:

grammar fromto;

public <request> = where can i go {wherego}
                 | places {wherego}
                 | destinations {wherego}
                 | where am i {whereami}
                 | location {whereami}
                 | how are you {howareyou};



When a grammar is active, the recognizer listens for speech in the incoming audio that

matches the grammar. When speech is detected, the recognizer produces a result. The

result object is passed to the application and contains information about which words

were heard.
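To make this flow concrete, here is a minimal javax.speech recognition sketch; the grammar file name fromto.gram is an assumption, and the class is illustrative rather than the thesis code.

import java.io.FileReader;
import java.util.Locale;
import javax.speech.Central;
import javax.speech.EngineModeDesc;
import javax.speech.recognition.Recognizer;
import javax.speech.recognition.Result;
import javax.speech.recognition.ResultAdapter;
import javax.speech.recognition.ResultEvent;
import javax.speech.recognition.ResultToken;
import javax.speech.recognition.RuleGrammar;

public class RecognizerSketch extends ResultAdapter {

    // Called when the recognizer finalizes a result (RESULT_ACCEPTED).
    public void resultAccepted(ResultEvent e) {
        Result result = (Result) e.getSource();
        ResultToken[] tokens = result.getBestTokens();
        StringBuffer heard = new StringBuffer();
        for (int i = 0; i < tokens.length; i++) {
            heard.append(tokens[i].getSpokenText()).append(' ');
        }
        System.out.println("heard: " + heard);
    }

    public static void main(String[] args) throws Exception {
        Recognizer rec = Central.createRecognizer(new EngineModeDesc(Locale.ENGLISH));
        rec.allocate();
        RuleGrammar grammar = rec.loadJSGF(new FileReader("fromto.gram"));
        grammar.setEnabled(true);                 // listen for the fromto commands
        rec.addResultListener(new RecognizerSketch());
        rec.commitChanges();                      // apply grammar changes
        rec.requestFocus();
        rec.resume();                             // start recognizing
    }
}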

The primary function provided by the Synthesizer interface is the ability to speak text,

speak Java Speech Markup Language text, and control an output queue of objects to be

spoken. A Synthesizer is created by a call to the Central.createSynthesizer method. The

default voice is male, and the language is English, which can be modified by the

designer. In this project, we define a VocalView, which can speak plain text or String in

JSML format.
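A matching synthesis sketch, again illustrative (the class name is not the thesis's VocalView):

import java.util.Locale;
import javax.speech.Central;
import javax.speech.synthesis.Synthesizer;
import javax.speech.synthesis.SynthesizerModeDesc;

public class VocalViewSketch {
    public static void main(String[] args) throws Exception {
        Synthesizer synth = Central.createSynthesizer(new SynthesizerModeDesc(Locale.ENGLISH));
        synth.allocate();                                // acquire the engine
        synth.resume();                                  // ensure output is not paused
        synth.speakPlainText("Turn left on to Hub Walkway 2", null);
        synth.waitEngineState(Synthesizer.QUEUE_EMPTY);  // block until spoken
        synth.deallocate();
    }
}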









The Differential GPS receiver is connected to the serial port COM2 on the wearable computer and is configured to output an NMEA 0183 sentence, which is an ASCII string that contains Global Positioning Fix Data. The format of the sentence is as follows:

$GPGGA,hhmmss,xxxx.xx,a,yyyyy.yy,a,x,xx,x.x,x.x,M,x.x,M,x.x,xxxx*hh

in which hhmmss is the UTC (Coordinated Universal Time) of the position in hours,

minutes and seconds, xxxx.xx,a is the latitude, North/South and yyyyy.yy,a is the

longitude, East/West.
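A minimal parsing sketch for such a sentence follows; the sample string is illustrative (not real receiver output) and the field indices follow the $GPGGA layout above.

public class GgaParser {

    // Converts NMEA "ddmm.mm"/"dddmm.mm" plus hemisphere into signed decimal degrees.
    static double toDegrees(String value, String hemisphere) {
        double raw = Double.parseDouble(value);
        double degrees = Math.floor(raw / 100.0);
        double minutes = raw - degrees * 100.0;
        double result = degrees + minutes / 60.0;
        return (hemisphere.equals("S") || hemisphere.equals("W")) ? -result : result;
    }

    public static void main(String[] args) {
        String sentence = "$GPGGA,170834,2938.95,N,08220.61,W,1,05,1.5,30.0,M,-25.0,M,,0000*75";
        String[] f = sentence.split(",");
        double latitude = toDegrees(f[2], f[3]);    // fields 2-3: latitude, N/S
        double longitude = toDegrees(f[4], f[5]);   // fields 4-5: longitude, E/W
        int fixQuality = Integer.parseInt(f[6]);    // field 6: GPS fix quality
        System.out.println(latitude + ", " + longitude + ", fix " + fixQuality);
    }
}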

The FromtoListener is the trigger of the client control; it calls the VIClient to start various function calls according to the different requests made by the user. Once the user makes a request, a new result object is created when the recognizer detects incoming speech that may match the grammar activated when the client first started. Once the recognizer completes recognition of the Result that it chooses to accept, it finalizes the result with a RESULT_ACCEPTED event that is issued to the ResultListeners attached to the Recognizer, the matched Grammar, and the Result. The VIClient is invoked to perform different functions according to the accepted tokens, which are expressed as Strings.

The VIClient is the core of the client functions. All the managers and listeners are

implemented as new threads. Many first-in-first-out queues are initialized, including

the direction queue and coordinates queue for the local Navigation Manager if the

navigation mode is set to local. The information request queue is also made here. The

Sender is initialized to send requests from the queue. The direction listener and

information listener start listening to the predefined port for the incoming reply object.

Different functions are implemented by wrapping different requests in different packet

headers. Each packet has a header identifying the type of request and a body describing the detailed request. These packets are inserted into the queues where they wait to be sent to

the server by the Sender.










The Sender is one direction of the communication bridge between client and server. It uses the UDP (User Datagram Protocol) communication protocol. The Sender works continuously, filling packets by removing objects from the queue and sending them to the server, as sketched below. If the object is an information object, the Sender will wait for an

acknowledgement from the server before it sends out the packet to make sure the server

is activated and the packet will not be lost. If the object is a coordinate object, the Sender

will send it without asking for acknowledgement because the coordinate object is updated

every second and is continuously changing.
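A minimal sketch of the UDP send path, assuming the queued requests are serializable objects; the class and method names are illustrative, not the thesis code.

import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class SenderSketch {
    public static void send(Object request, String host, int port) throws Exception {
        // Serialize the queued request object into a byte array.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeObject(request);
        out.flush();
        byte[] payload = bytes.toByteArray();

        // Send it as a single datagram; coordinate objects go unacknowledged.
        DatagramPacket packet = new DatagramPacket(payload, payload.length,
                InetAddress.getByName(host), port);
        DatagramSocket socket = new DatagramSocket();
        socket.send(packet);
        socket.close();
    }
}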

The other feature of the server/client communication bridge is its various listeners.

The DGPS Listener is a thread that registers as a serial port event listener that can be

notified once there is input (a byte stream) from the DGPS receiver via the serial port.

The listener parses the byte stream to get a (latitude, longitude, fix quality) tuple, and

then creates a new coordinate object. This object is passed to the Sender, waiting to be

sent out to the Navigation Manager on the server side if the user sets the navigation mode

to be "setserver." This object will be passed to the Navigation Manager on the client side

if the navigation mode is "setlocal". The latter can only be done after the server puts the

route object back to the local Navigation Manager. Each coordinate is shown on the Path Viewer, a small panel on which the visually impaired person can check his or her current status on the path, and the VocalView speaks the navigation prompts aloud.

There are two direction listeners, each implemented as a thread. One waits for the infoObject from the server carrying the navigation prompt. The other works locally, watching the direction queue and unwrapping the infoObject from the queue to get the local navigation prompt.









The Information Listener is a thread that always listens to the server for all the information

objects except for navigation prompts. The information objects are extracted from the

ByteArrayInputStream and the reply for the request is spoken to the user via VocalView.

4.2.2 ClientServer Proxy


Figure 4-4 Client Manager Architecture

Figure 4-4 illustrates the client manager architecture that manages the client/server communication. In the server proxy, the clientlistener is implemented as a thread that continuously listens for requests from the registered client address and extracts the incoming object according to the package header type. There are three kinds of

packages (requests): information objects that will be handled by the Information Request

Manager (IRM), GPS coordinate objects that will be handled by the Navigation Manager















(NM) to provide navigation prompts, and register objects that register the client address with the server. Because the information object is not time sensitive and the server must be certain to receive it, it requires an acknowledgement. The GPS coordinate object comes in at a very fast pace and is continuously changing, so no acknowledgement is necessary. The clientlistener puts the information objects and GPS coordinate objects into different first-come-first-serve queues for the information manager and navigation manager to use.

How the GPS coordinate object is handled and how the Navigation Manager works are

explained in Chapter 3 and Figure 4-5. Here, I will describe how the indoor part of the

server works. How the location server proxy is bundled and how it works will be

illustrated in the next chapter.

There are many kinds of information requests depending on the different queries the

user makes. The IRM is implemented as a thread that continually takes the requests from

the information queue. If the queue is empty, the IRM will keep waiting. Once a request

comes up, say, a current location request, IRM will ask the VISDEClient to process the

request and wrap the reply in an information object and put it in the reply queue from

where the Sender will pick up the information object and send it to the client. The process

is illustrated in the following flowchart.



































Figure 4-5 Work Process of DrishtiIndoor
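The take/process/reply loop pictured in the flowchart can be sketched as follows; this uses the modern java.util.concurrent queue for brevity (the thesis predates it), and the class names are illustrative.

import java.util.concurrent.BlockingQueue;

public class InfoReqManagerSketch implements Runnable {
    private final BlockingQueue<Object> requests;
    private final BlockingQueue<Object> replies;

    public InfoReqManagerSketch(BlockingQueue<Object> requests,
                                BlockingQueue<Object> replies) {
        this.requests = requests;
        this.replies = replies;
    }

    public void run() {
        try {
            while (true) {
                Object request = requests.take();  // blocks while the queue is empty
                Object reply = process(request);   // e.g., delegate to the VISDEClient
                replies.put(reply);                // the InfoSender picks this up
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();    // allow a clean shutdown
        }
    }

    private Object process(Object request) {
        return request;  // placeholder for the real location lookup
    }
}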

The VISDEClient needs two pieces of information to finish the user's request for his

or her current location. The first piece of information is the coordinate of the user and the

second one is the relation of these coordinates to the indoor facility. To get the

coordinates, VISDEClient asks HexClient to communicate with the Location System

Bundle, which throws events about the coordinates twice every second. This bundle sits in

a different server, and is connected to the Hexamite ultrasound location system. I will

explain this in the next section. With these coordinates, VISDEClient asks SDEClient to

communicate with SDE (Spatial Database Engine) server. Then com.esri.sde.client java

API is used to compute the relationship between the shapes from the SDE and the shapes

made from the known coordinates. Different shapes are retrieved from the database and

compared with the shape, which is a buffer with diameter 0.1 foot centered at the










coordinate. If this shape is contained in the room shape, we can say the user is in that

room. If this shape is within a specific distance to the furniture shape or room shape

boundary, the system will prompt the user saying he/she is too close to the furniture or

wall.

If the request is about how to get to a place, for example a room, we need some geometric calculation because the SDE Java API cannot satisfy the request. The Hexamite location system gives the user's coordinates as well as the orientation. We can use the orientation and the layout of the room plan to calculate the angle at which the user should turn and the distance ahead, as illustrated below in Figure 4-6.


Figure 4-6 Example of Geometric Calculation of Orientation

As displayed in the above figure, the user can ask the system for directions to the

desired destination, and the system may tell him/her the angle at which he should turn,




the distance he has to travel, or it may correct the user's orientation along the way and

guide the user step-by-step to the destination.
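A minimal sketch of this calculation follows; the convention of angles measured in degrees from the X-axis is an assumption for illustration.

public class GuidanceMath {

    // Returns {turnAngleDegrees, distance} from (x, y) with the given heading
    // (in degrees) to the destination (destX, destY).
    static double[] directionsTo(double x, double y, double headingDeg,
                                 double destX, double destY) {
        double dx = destX - x;
        double dy = destY - y;
        double distance = Math.hypot(dx, dy);
        double bearingDeg = Math.toDegrees(Math.atan2(dy, dx));
        double turn = bearingDeg - headingDeg;
        while (turn > 180) turn -= 360;    // normalize to (-180, 180]
        while (turn <= -180) turn += 360;
        return new double[] { turn, distance };
    }
}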














CHAPTER 5
LOCATION SERVER

The location server for Drishti is made up of two components. One is the

Hexamite location system, which uses ultrasound devices for high-resolution tracking

and guidance, developed by an OEM company called "Hexamite" in Australia. The other

is the Drishti location server proxy, which uses Open Services Gateway Initiative (OSGI)

to bundle the custom service software of the Hexamite system to provide the Drishti

server with indoor location of the user.

5.1 Hexamite Location System

5.1.1 Hardware Components

The Hexamite ultrasound location system consists of two or more Hexamite Positioning Devices, where one device knows the distance to another. The device that

knows the distance to the other is called the pilot while the other device is called the

beacon, as shown in Figure 5-1. In this project, the HE900M is the pilot and HE900T is

the beacon. The system computes the distance between the pilots and beacons based on

the time difference of the ultrasound traveling in-between them and the travel speed of

ultrasound. The third part of the system is the RS485/RS232 converter, which connects

all the pilots to the central computer.

The HE900T scans for ultrasonic activity; if nothing is detected, the device goes

progressively into a deeper and deeper sleep mode that saves power. It comes with an

internal rechargeable Manganese Dioxide Lithium battery, which allows the user to wear

these beacons without wire. The HE900T can be fully charged through the pin










4 (negative) and pin 8 (positive) provided on the back of the device in 10 hours, and it can then be discharged (used) for 10 hours. We attach two HE900Ts to the shoulders of the user to find out his or her location (coordinates) and orientation.

















HE900T (beacon) HE900M (pilot)

Figure 5-1 Hexamite Location System Devices

The HE900M is mounted on the ceiling facing the center of the house. It is connected

to a RS485 network. The network is plugged into the serial port of the central computer

through a RS485 to RS232 converter. The detection angle for the HE900M is 130

degrees at a distance of 6 meters. At a distance of 8 meters, this angle is 75 degrees. The

detection range can be up to 16 m, and the maximum resolution is 0.3 mm. Any one of

the pilots can be configured as the master. The master initiates the timing or distance

acquisition cycle of the whole system by transmitting a synchronization signal at the

beginning of the cycle. At the end of the cycle, the pilots transmit their positioning data

one after another over the serial network. The location system can consist of a limitless

number of Hexamite Positioning Devices configured as pilots and beacons to form a

large system to achieve better precision of location. In this project we have four HE900M









pilots. We set one at every corner of the smart house to give 360 degrees of coverage as

represented in Figure 5-2.


Figure 5-2 Hexamite Ultrasound Location System Coverage of the Smart House

5.1.2 Hardware Configuration

The Hexamite Positioning Device can be configured through its serial port. It can be

operated by and connected directly to a personal computer. The software in the central

computer communicates with the hardware, calculates the position and orientation and

creates location events for this location server. It first sends an escape control character to

ready the devices for commands; then it reads the Settings.txt and sends the setup strings

to the HE900Ms. If the setup configuration is feasible, the HE900M will configure











themselves and return a "+". After the custom software receives a "+", a carriage return (0D) is sent to the hardware to start an acquisition cycle. This configuration initialization is done in the class Location.java. Figure 5-3 illustrates our sample Settings.txt file.


*************************** If device is a Beacon *******************
ProgramCtrl,0 = if set Beacon is Air Synchronized
ProgramCtrl,1 = if set Beacon is Synchronized via Network
ProgramCtrl,2 = if set Beacon is Synchronized via Port Pin
ProgramCtrl,3 = Must be 0
ProgramCtrl,4 = Must be 0
ProgramCtrl,5 = RESERVED
ProgramCtrl,6 = RESERVED
ProgramCtrl,7 = if set Device is Beacon
***************************** If device is a Pilot *******************
ProgramCtrl,0 = if set this Pilot transmits sonic synchronization
ProgramCtrl,1 = if set this Pilot is I/O pin synchronized
ProgramCtrl,2 = if set this Pilot's Autotracking is Enabled
ProgramCtrl,3 = if set this Pilot transmits continuously
ProgramCtrl,4 = if set this Pilot is an initiator in a chain of pilots
ProgramCtrl,5 = RESERVED
ProgramCtrl,6 = RESERVED
ProgramCtrl,7 = if cleared Device is a Pilot

If the first letter in the line is # the program will transmit the following setup string onto the LPS network:

#Q 01 0D 03 01 06 00
#O 01 50 03 01 06 00
#S 01 4E 03 01 06 00
#Y 11 52 03 01 06 00

Figure 5-3 Settings.txt for Hardware Configuration

Once the Settings.txt file is found, the custom software transmits the string following


the ASCII character # through the central computer's serial port for the hardware


configuration. This system consists of 6 devices, four of which are configured as pilots


and two as beacons. The pilots are linked via the serial ports. The beacons are mobile and


out of reach, so they are not linked.


The first part of the string is the device address (DA). Q, O, S and Y are the secondary addresses of the four HE900Ms; each secondary address is the letter that immediately follows the pilot's primary address. The primary addresses of the pilots in this system are P, N, R and X, which correspond to the decimal ASCII codes 80, 78, 82 and 88 respectively. The second part of











the setting line is the Programming Control Byte (PCB), which configures the three slaves and one master. The slave PCB is set to 01 while the master's is 11. A PCB of 11 corresponds to an integer representation of 17, which sets bits 4 (2^4) and 0 (2^0), as displayed in Figure 5-3. The slaves should be configured before the master. They all transmit sonic synchronization. Of

the four pilots, the pilot closest to the beacon is the one that synchronizes the beacon. The process is that the beacons send ultrasonic signals to the pilots at a predefined time interval, and after receiving the signal, the master pilot creates a distance string and passes the string to the next pilot configured by the termination byte. The third part is the

Termination Byte (TB). If the termination byte of device 'X' is the primary address of device 'Y' on the same network, then device 'Y' will be prompted to transmit its position acquisition result once device 'X' has completed the transmission of its results. The last device in the setting chain has a TB of "carriage return" (0D). The TB of the master device X is the hexadecimal number 52, which indicates pilot R (secondary address S), so that after X finishes transmitting its distance string, pilot R will follow, then N (secondary address O), and then P (secondary address Q). P will call for another cycle by setting its TB to 0D. We also call pilots X, R, N and P pilot 1, 2, 3 and 4 respectively, as in Figure 5-2. The

fourth part of the string is the Number of Beacons Byte (NBB), which is one more than

the number of beacons the pilot looks for during the position acquisition cycle. So in this

system, pilots are looking for 2 beacons. The fifth part indicates the number of pilots

from which the beacon receives messages. In this system the number is 1. The sixth part

is the number of devices in the system. In our case it is 6. The remaining part is about

beacons and is neglected.










5.1.3 Distance String

The distance string is built up pilot by pilot, from the master to the slaves, during a transmission cycle, for example:

-0E22 0123 0C34 R0B33 0E13 23A1 N03AB 3345 09C3 P0202 0FE3 00F2

The hyphen in front of the string means the device has entered the position acquisition cycle. 0E22 is the distance between the master pilot and the nearest obstacle. The segment for each slave begins with the name of the pilot followed by the distance between that slave and the master, like R0B33, N03AB and P0202. 0123 is the hexadecimal distance from the master pilot 1 (pilot X) to the first beacon, which is 291 mm. 0C34 is the distance from pilot X to the second beacon, which is 3124 mm. 0E13 is the distance from pilot R to the first beacon, and so on. We get all the distances in one cycle, choose the shortest distances to the beacons, and correct them using the following formulas:

shortest distance to beacon #1 = shortest distance to beacon #1 / 2
next shortest distance to beacon #1 = next shortest distance to beacon #1 − shortest distance to beacon #1
shortest distance to beacon #1 = 0.688 × shortest distance to beacon #1
next shortest distance to beacon #1 = 0.688 × next shortest distance to beacon #1


The formulas for beacon #2 are the same. First, we divide the shortest distance to the beacon by 2 because each beacon's measured time covers the sound traveling from the pilot to the beacon and back, i.e., up to two times the maximum defined range of the system. When we calculate the next closest distance to the beacon, we subtract the shortest distance to the beacon from the original next shortest distance. This is because of the way the pilots ping the beacons. At the beginning, all the pilots send out ultrasound at the same time to ping the transceivers (the beacons in our case). The beacons accept only the first signal, no matter which pilot sent it; then they time-stamp the signal and broadcast it back to all the pilots. The pilots receive this signal and record the time of arrival. So the measured time difference is the sum of the time the accepted signal took to travel from a pilot to the beacon and the time the signal took to travel from the beacon back to the pilots. The factor 0.688 is the resolution coefficient of the timer that measures the time of flight; the timer increments every 2 microseconds. The beacon transmits organized signals, which help each pilot distinguish it from the others; this is also how the system can tell the orientation of the user. Because obstacles on the way from the pilot to the beacon easily reflect ultrasound, we choose the shortest and next shortest distances to calculate the location, to avoid error.
location to avoid error.

Let F and S denote two pilots, T the beacon, M the foot of the perpendicular from T to the segment FS, L2 = FT, L1 = ST and H = TM (Figure 5-4). Then:

L2² − FM² = H² (5-1)

L1² − SM² = H² (5-2)

FM + SM = FS (5-3)

FM = (L2² − L1² + FS²) / (2 · FS) (5-4)

H = √(L2² − FM²) (5-5)

Figure 5-4 Location Calculation Trilateral

FM and H can be used to get different coordinates in different scenarios. For example,

when the beacon is closest to pilot 1 and pilot 2, the calculation process may be

illustrated as in Figure 5-5.

















Figure 5-5 Location Calculation Example

In the above case, with pilot S at the origin (0,0), side L2 is the shortest distance and side L1 is the next shortest distance. The line TM is perpendicular to the line FS and parallel to the X-axis, so the coordinates in this case can be expressed as (H, FS − SM) in the format of (X coordinate, Y coordinate). The class Tag.java calculates coordinates in 8 such cases; the one above is sketched below.
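A sketch of this single case in Java follows; Tag.java in the thesis covers all 8 placement cases, and the class and method names here are illustrative.

public class TrilaterationSketch {

    // l2: shortest distance (to pilot F), l1: next shortest (to pilot S),
    // fs: known pilot-to-pilot distance. Returns the (X, Y) of the beacon
    // for the Figure 5-5 case, i.e., (H, FS - SM).
    static double[] locate(double l2, double l1, double fs) {
        double fm = (l2 * l2 - l1 * l1 + fs * fs) / (2 * fs);  // eq. 5-4
        double h = Math.sqrt(l2 * l2 - fm * fm);               // eq. 5-5
        double sm = fs - fm;                                   // eq. 5-3
        return new double[] { h, fs - sm };
    }
}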

Because each beacon has its own ID, we can distinguish the two beacons as left and right to get the orientation once the coordinates of both beacons are available. We assume the coordinates of the user's head can be deduced as the midpoint of the left and right points. We can infer the following formulas for the person's location in the format of (X coordinate, Y coordinate), where P represents the person:

P.X = (P.left.X − P.right.X)/2 + P.right.X (5-6)
P.Y = (P.right.Y − P.left.Y)/2 + P.left.Y (5-7)









Figure 5-6 shows how to analyze the orientation when the user is facing within the range of 270 to 360 degrees, which satisfies the condition P.left.X > P.right.X and P.left.Y > P.right.Y. There are 8 different cases in the orientation analysis in this system.


when P.left.X > P.right.X and P.left.Y > P.right.Y:

orientation = 360 - a, and b = a, where

b = Math.toDegrees(Math.atan((P.left.X - P.right.X)/(P.left.Y - P.right.Y)))

so we get

orientation = 360 - Math.toDegrees(Math.atan((P.left.X - P.right.X)/(P.left.Y - P.right.Y)))

public Person getPosition(Person p) {
    // No matter which direction the person is facing, the position is the
    // midpoint of the left and right beacons (equations 5-6 and 5-7).
    double x = (p.left.X - p.right.X) / 2 + p.right.X;
    double y = (p.right.Y - p.left.Y) / 2 + p.left.Y;
    return new Person(x, y, orientation);  // orientation computed as above
}

Figure 5-6 Orientation and Position Analysis when P.left.X > P.right.X and P.left.Y > P.right.Y

5.2 OSGI Location Service

5.2.1 OSGI

OSGI stands for the Open Service Gateway Initiative, an open specification for the delivery of multiple services over wide area networks to local networks and devices. The specification is a Java-based application layer framework, as we can see from Figure 5-7, that gives service providers, network operators, device makers, and appliance manufacturers vendor-neutral application and device layer APIs and functions. This strategy enables virtually all emerging home networking platforms, protocols and services to seamlessly interoperate with back-end services, using existing residential telephone, cable TV, or electrical wiring.









The layered architecture of OSGI (Figure 5-7) [23] has three components:

frameworks, bundles and services. Framework is the core of OSGI specification,

providing a common ground or environment to execute services from different vendors.

It provides a general-purpose, secure, managed Java framework that supports the

deployment of extensible and downloadable service applications known as bundles.


Figure 5-7 OSGI Architecture

In the OSGI environment, bundles are the only entities for deploying Java-based

applications. A bundle is composed of Java classes and other resources, which together can provide functions to end-users and provide components, called services, to other bundles. A bundle is deployed as a Java Archive (JAR) file. JAR files store applications and their resources in a standard ZIP-based Java file format. Bundles can be

installed to provide services for the user and removed once the service is no longer

required. Installed bundles can register a number of services that can be shared with other

bundles under strict control of the framework. The framework can manage the









installation and update of bundles in an OSGI environment in a dynamic and scalable

fashion.

A JAR file of a bundle

* contains the resources to implement zero or more services. These resources may be class files for the Java programming language or other data files.

* contains a manifest file describing the contents of the JAR file and providing information about the bundle.

* states dependencies on other resources, like Java packages, that must be available to the bundle before its operation.

* designates a special class in the bundle to act as the Bundle Activator. The framework must instantiate this class and invoke its START and STOP methods to start or stop the bundle.

A bundle may have the following states, as shown in the flow chart of Figure 5-8.

OSGI emphasizes an open environment for all different services; everything is a service in this model. Applications in OSGI are a collection of services, even though the services may come from different providers. Such OSGI services are defined by their service interface and implemented as service objects. The purpose of the service interface is to specify the semantics and behavior of a service; the interface also makes the implementation transparent to the user. There can be many service implementations for a single service interface, which is one of the many strengths of OSGI. For example, many camera vendors, offering many different camera brands and models, can all implement a single standardized service interface for controlling a PC camera. A service object is a Java object of a class that implements the service interface. A service object is owned by, and runs within, a bundle; this bundle must register the service with the framework service registry so that the service's functionality is available to other bundles under control of the framework.
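To make this concrete, the following minimal sketch shows how a bundle could register a service object with the framework service registry; the LocationService interface and its implementation are hypothetical names used only for illustration:

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// A hypothetical service interface; the real Drishti interfaces may differ.
interface LocationService
{
    String getCoordinates();
}

// One possible implementation of the service interface.
class SimpleLocationService implements LocationService
{
    public String getCoordinates()
    {
        return "0.0, 0.0";
    }
}

public class LocationServiceActivator implements BundleActivator
{
    public void start(BundleContext context)
    {
        // Register the service object under its interface name so that
        // other bundles can obtain it from the framework service registry.
        context.registerService(LocationService.class.getName(),
                                new SimpleLocationService(), null);
    }

    public void stop(BundleContext context)
    {
        // Services registered by this bundle are unregistered automatically
        // when the bundle stops.
    }
}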

Figure 5-8 Bundle State Diagram (automatic and explicit transitions between bundle states)

5.2.2 Location Server with Indoor Location Service Bundle

The custom indoor location service software communicates with the Hexamite hardware to get the distance string, parses the string to obtain the shortest distances between the beacons and pilots, and then performs geometric analysis to provide the user with location coordinates and orientation. These software files are packaged into an OSGI bundle that provides a generic location service, so that multiple users can share the information simultaneously by registering for the service. Figure 5-9 illustrates the scheme of this location service.

Figure 5-9 Location Service Scheme (the indoor location service bundle with its Activator, the EventBroker bundle, and the LocationServer bundle)

The way the location services are bundled is based on the work done by Sree Kuchibhotla [24]. Three bundles work together to provide the user with location information, which can be shared by other bundle applications when there are simultaneous requests for location data. The indoor location service bundle contains all the custom location service software and returns a Person object, which holds the coordinates and orientation, twice every second. This service is activated by the Activator, which interacts with the OSGI Framework once the Framework user starts the bundle. The Person objects are thrown as events to the EventBroker bundle, which has a thread continuously listening for them. The EventBroker bundle acts like a middleman, receiving each event and throwing it out to its receivers. Any bundle that has registered with the EventBroker can be a receiver and grab the event; the virtue of the EventBroker is that it can serve many requests at the same time. In our case, the LocationServer bundle gets the events thrown from the EventBroker and listens for client requests. Once a client communicates with the LocationServer, the client can get the Person object. This communication is implemented using TCP.
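As an illustration only, a client of the LocationServer might look like the sketch below. The host name, port number, and line-based reply format are assumptions made for this example; the actual wire protocol is defined by the LocationServer bundle:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.Socket;

public class LocationClient
{
    public static void main(String[] args) throws Exception
    {
        // Connect to the LocationServer (hypothetical host and port).
        Socket socket = new Socket("location-server.local", 5000);
        BufferedReader in = new BufferedReader(
                new InputStreamReader(socket.getInputStream()));

        // Read one reply, assumed here to be "x y orientation",
        // e.g. "12.5 7.3 270.0".
        String reply = in.readLine();
        System.out.println("Current position and orientation: " + reply);

        socket.close();
    }
}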

Because of the openness of OSGI, we can run many applications simultaneously on the Framework of the same computer. For example, a blind person using the indoor location system to guide him while walking in the smart house may ask the server to turn on the TV or radio, which is supported by another application bundle running concurrently on the server; he may ask Drishti for directions to the microwave and then ask another service to open the microwave door and cook for him! You can use your imagination and add as many applications as you wish!

CHAPTER 6
SUMMARY AND FUTURE WORK

The previous chapters explain in detail the motivation, architecture, and implementation of the integrated navigation system for blind or visually impaired people. They also describe the related technologies used by the system.

6.1 Achievement and Contribution

This thesis is an exciting attempt to combine the indoor and outdoor navigation systems into one complete system. It uses COTS software and hardware to provide blind people with increased convenience in traveling and living while leaving their hands free. The equipment the user needs to wear is a small, brick-sized wearable computer, a GPS receiver, a microphone-speaker headset, and two ultrasound transceivers that are smaller than a credit card. The indoor component integrates seamlessly with the Drishti outdoor system through the simple addition of the two ultrasound transceivers. The user can switch from outdoor mode to indoor mode by simply saying "indoor" or "room" when he enters an indoor environment, and can switch back to outdoor mode once he goes outside by saying "navigate."

In indoor mode, because the system is bundled as an OSGI service, the location information can be shared simultaneously by other services running on the same server. For example, the user may be monitored by a care-giving center in case of emergency, or he may use other services, such as having a door open automatically once he faces it within some distance. Other applications can be added to the OSGI environment to give the user more convenience without affecting the existing services.

6.2 Future Work

The working range of this system depends on the coverage of the wireless network. To give the user a larger traveling range without the expense of installing a wireless network, we plan to replace the wearable computer with a mobile phone that communicates with the server. This will also lighten the user's load.

The system assumes that the user is standing and estimates the vertical position of the beacons as an average person's height. To make the system more realistic, it could be extended to 3D measurement by adding pilots at different heights in the house and putting beacons on different parts of the user's body. The system could then recover the user's true posture, including standing, sitting, bending, and lying, and thereby detect falls.

REFERENCES


[1] Virtanen, Ari, 2002. "Navigation and Guidance System for the Blind."
Available from URL:
http://www.vtt.fi/aut/results/navi/navigationandguidancefortheblind.ppt.
Site last visited December 2002.

[2] Moore, S. E., "Drishti: An Integrated Navigation System for the Visually
Impaired and Disabled." Master's thesis, University of Florida.

[3] Madry, S.; Colvard, C.; Lathrop, R., 2002. "A New Paradigm-The GIS
'Layer Cake.'" Available from URL:
http://crssa.rutgers.edu/courses/geomaticsinfo/sld004.htm. Site last
visited December 2002.

[4] Environmental Systems Research Institute. 2002. "ESRI, GIS and
Mapping Software." Available from URL: http://www.esri.com. Site last
visited December 2002.

[5] Brain, M.; Harris, T., 2002. "How GPS Receivers Work." Available from
URL: http://www.howstuffworks.com. Site last visited December 2002.

[6] Environmental Systems Research Institute. 2002. "ArcSDE: the Gateway
for GIS Data in a DBMS." Available from URL:
http://www.esri.com/library/brochures/pdfs/sdebro.pdf. Site last visited
December 2002.

[7] Herian/Hexamite Cooperative. 2002. "Hexamite Positioning Devices
Utilize Ultrasound for High Resolution High Repeatability
Multidimensional Multipoint Guidance and Tracking." Available from
URL: http://www.hexamite.com. Site last visited December 2002.

[8] Mann, S., 1998. "Wearable Computing FAQ." Available from URL:
http://www.wearcam.org/wearcompfaq.html. Site last visited December
2002.

[9] Borenstein, J.; Koren, Y., "Obstacle Avoidance with Ultrasonic Sensors."
IEEE Journal of Robotics and Automation, Volume 4, Issue 2, April
1988, Page(s): 213-218.

[10] Ram, S.; Sharf, J., "The People Sensor: A Mobility Aid for the Visually
Impaired." Second International Symposium on Wearable Computers,
Digest of Papers, 1998, Page(s): 166-167.

[11] Zelek, J., 2002. "The E. (Ben) & Mary Hochhausen Fund for Research in
Adaptive Technology For Blind and Visually Impaired Persons."
Available from URL:
http://www.eos.uoguelph.ca/webfiles/zelek/cnib2.pdf. Site last visited
December 2002.

[12] Hu, H.; Probert, P., "The Oxford Project and the GEC AGV." Chapter in
Advanced Guided Vehicles: Aspects of the Oxford AGV Project, Editors
S. Cameron and P. J. Probert, Oxford, UK: World Scientific Press, 1994,
Page(s): 9-16.

[13] Golding, A. R.; Lesh, N., "Indoor Navigation Using a Diverse Set of
Cheap, Wearable Sensors." The Third International Symposium on
Wearable Computers, Digest of Papers, 1999, Page(s): 29-36.

[14] Feiner, S.; MacIntyre, B.; Hollerer, T.; Webster, A., "A Touring Machine:
Prototyping 3D Mobile Augmented Reality Systems for Exploring the
Urban Environment." First International Symposium on Wearable
Computers, Digest of Papers, 1997, Page(s): 74-81.

[15] Mann, S., 1997. "Existential Technology of Synthetic Synesthesia for the
Visually Challenged." Available from URL:
http://eyetap.org/wearcomp/isea97/index.html. Site last visited December
2002.

[16] Chen, T.; Shibasaki, R., "A Versatile AR Type 3D Mobile GIS Based
on Image Navigation Technology." IEEE International Conference on
Systems, Man and Cybernetics, 1999, Page(s): 1070-1075.

[17] Makino, H.; Ishii, I.; Nakashizuka, "Development of Navigation System
for the Blind Using GPS and Mobile Phone Combination." Engineering in
Medicine and Biology Society, 1996. Bridging Disciplines for
Biomedicine, 18th Annual International Conference of the IEEE, Volume
2, 1997, Page(s): 506-507.

[18] Loomis, J. M.; Golledge, R. G.; Klatzky, R. L.; Speigle, J. M.; Tietz, J.,
"Personal Guidance System for the Visually Impaired." Proc. 1st Ann.
ACM/SIGCAPH Conf. on Assistive Tech., Page(s): 85-91, 1994.

[19] Walsh, D.; Capaccio, S.; Lowe, D.; Daly, P.; Shardlow, P.; Johnston, G.,
"Real Time Differential GPS and GLONASS Vehicle Positioning In
Urban Areas." IEEE Conference on Intelligent Transportation System,
1998, Page(s): 514-519.

[20] Kridner, C., "A Personal Guidance System for the Visually Disabled
Population: The Personal Indoor Navigation System (PINS)." Available
from URL:
http://vision.psych.umn.edu/www/people/legge/5051/PGS2.pdf. Site last
visited December 2002.

[21] Ertan, S.; Lee, C.; Willets, A.; Tan, H.; Pentland, A., "A Wearable
Haptic Navigation Guidance System." The Second International
Symposium on Wearable Computers, Digest of Papers, 1998, Page(s):
164-165.

[22] Helal, A.; Moore, S. E.; Ramachandran, B., "Drishti: An Integrated
Navigation System for Visually Impaired and Disabled." Fifth International
Symposium on Wearable Computers, Proceedings, 2001, Page(s): 149-156.

BIOGRAPHICAL SKETCH

I received my Master of Science degree in chemistry in China in 1996 and worked for two years in the National Research Center of Certified Reference Materials in Beijing, China. Then I found myself deeply attracted to computer science, one of the most promising fields in the world.

I came to America in 1999 and began my studies in the Department of Computer and Information Science and Engineering at the University of Florida in 2000, where I immersed myself in the wonderful knowledge. During the Christmas break of 2001, I went to see Dr. Helal and began work on the Drishti project. I truly enjoyed the days and nights I spent in the Harris and ICTA labs, because the more effort I put into the work, the more knowledge and experience I gained.

With Drishti as my stepping-stone, I aim to reach greater heights in my coming career.