152                                    Autonomous Mobile Robots

                                4.2 LANDMARK-BASED NAVIGATION
                                In a landmark-based navigation system, the robot relies on its onboard sensors
                                to detect and recognize landmarks in its environment to determine its position.
This navigation system very much depends on the kind of sensors being used, the
                                types of landmarks, and the number of landmarks available. For instance,
Sugihara [9] used a single camera on a robot to detect the identical points in the
environment and then adopted an O(n³ lg n) algorithm to find the position and
orientation of the robot such that each ray pierces at least one of the n points in
the environment. Extended versions were proposed in References 10 and 11,
respectively. Localization based on distinguishable landmarks in the environment
has been studied in Reference 12, where the localization error varies with
the configuration of the landmarks. Apart from vision systems,
                                other sensors have been widely used in position estimation, including laser [3],
                                odometry [13], ultrasonic beacons [6], GPS [7], IR [12], and sonars [14]. Since
                                no sensor is perfect and landmarks may change, none of these approaches
                                is adequate for a mobile robot to operate autonomously in the real world.
                                A landmark-based navigation system needs the integration of multiple sensors
                                to achieve robustness and cope with uncertainties in both sensors and land-
                                mark positions. This motivates us to pursue a hybrid approach to the problem
                                by integrating multiple sensors and different kinds of landmarks in a unified
                                framework.
                                   In general, the accuracy of the position estimation in a landmark-based
                                navigation system is affected by two major problems. The first problem is that
                                the navigation system cannot work well when landmarks accidentally change
their positions. If natural landmarks are used in the navigation process, their
positions must be prestored in the environment map so that the mobile robot
can localize itself during operation. The second problem is that sensory
measurements are noisy when the robot moves on an uneven floor surface or
changes speed frequently. The accuracy of robot positioning then degrades
gradually, sometimes becoming unacceptable during continuous operation.
Re-calibration is therefore needed from time to time, which becomes a burden
in real-world applications.
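This drift-and-recalibration cycle is commonly handled with recursive filtering: each odometry step inflates the position uncertainty, and each landmark observation shrinks it again. The following one-dimensional sketch is our illustration of that principle, not the chapter's algorithm; all variance values are assumed for the example:

```python
def predict(est, var, motion, motion_var):
    """Dead-reckoning step: shift the estimate and inflate its variance."""
    return est + motion, var + motion_var

def correct(est, var, meas, meas_var):
    """Landmark fix: minimum-variance blend of prediction and measurement."""
    k = var / (var + meas_var)               # Kalman gain
    return est + k * (meas - est), (1.0 - k) * var

# Ten odometry steps of 1 m, each adding 0.04 m^2 of noise:
# the variance grows linearly while the robot only dead-reckons.
est, var = 0.0, 0.0
for _ in range(10):
    est, var = predict(est, var, 1.0, 0.04)

# A single landmark observation (variance 0.1 m^2) re-calibrates
# the estimate and collapses the accumulated uncertainty.
est, var = correct(est, var, 10.2, 0.1)
```

After the ten prediction steps the variance has grown to 0.4 m²; the one landmark correction pulls it back down to 0.08 m², which is the filtering analogue of the manual re-calibration described above.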
                                   To effectively solve these problems, we propose a novel landmark-based
                                navigation system that is able to:

                                     • Initialize its position through triangulation when necessary
                                     • Update its internal landmark model when the position of landmarks
                                      is changed or new landmarks become available
                                     • Localize the robot position by integrating data from odometry, laser
                                      scanner, sonar, and vision
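The triangulation step in the first bullet can be sketched as follows. Given the known map positions of two landmarks and the world-frame bearings at which the robot sights them, the robot lies at the intersection of the two sight lines. The helper below is hypothetical and assumes absolute bearings (e.g., compass-referenced); with bearings measured relative to an unknown heading, a third landmark is needed to recover orientation as well:

```python
import math

def triangulate(lm1, lm2, bearing1, bearing2):
    """Intersect the two sight lines robot -> landmark.

    Each landmark satisfies lm = robot + t * (cos b, sin b) with t > 0,
    so the robot position is robot = lm - t * d for the solved t.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-9:
        raise ValueError("collinear bearings: position is ambiguous")
    dx, dy = lm2[0] - lm1[0], lm2[1] - lm1[1]
    t1 = (dy * d2[0] - dx * d2[1]) / det     # range to landmark 1
    return (lm1[0] - t1 * d1[0], lm1[1] - t1 * d1[1])
```

For example, a robot at (2, 3) that sees a landmark at (5, 3) due east (bearing 0) and a landmark at (2, 7) due north (bearing π/2) is recovered exactly by intersecting the two rays.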
Figure 4.1 shows the block diagram of our navigation system, which performs
concurrent localization and map building automatically. It is




                                 © 2006 by Taylor & Francis Group, LLC


