Page 46 - Autonomous Mobile Robots

Visual Guidance for Autonomous Vehicles                     29

When sensor outputs are read asynchronously, assumptions such as linear time invariance (LTI) [38] can be made to propagate the asynchronous data to the upcoming sample time of the control system. Robl [38] showed examples of using first-order-hold and third-order-hold methods to predict sensor values at desired times. When sensors of different resolutions are to be fused at the data level (e.g., fusion of range images from ladar and stereo vision), the sensor data with the higher spatial resolution is down-sampled by interpolation. For sensor fusion at the obstacle-map level, spatial synchronization is not necessary, since a unique map representation is defined for all sensors.
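The first-order-hold prediction mentioned above amounts to linear extrapolation from the two most recent samples to the controller's next sample time. A minimal sketch (function name and sample values are illustrative, not from the text):

```python
def first_order_hold(t_prev, x_prev, t_curr, x_curr, t_query):
    """Predict a sensor value at t_query by linearly extrapolating
    from its two most recent samples (first-order hold)."""
    slope = (x_curr - x_prev) / (t_curr - t_prev)
    return x_curr + slope * (t_query - t_curr)

# A range reading of 5.0 m at t = 0.00 s and 5.2 m at t = 0.05 s,
# propagated to the controller tick at t = 0.08 s:
predicted = first_order_hold(0.00, 5.0, 0.05, 5.2, 0.08)  # 5.32 m
```

A third-order hold would fit a cubic through the last four samples instead; the first-order version needs only two samples and is robust when the sensor rate is close to the control rate.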

                              Example: Fusion of laser and stereo obstacle maps for false alarm suppression

Theoretically, pixel-to-pixel direct map fusion is possible if the geometrical constraints established by calibration (e.g., the rotation and translation between the laser and stereo systems) remain unchanged after calibration. In practice, however, this is not realistic, partly because sensor synchronization is not guaranteed at all times: CPU loading, terrain differences, and network traffic for the map output all affect synchronization. Feature-based co-registration sensor fusion addresses this issue by computing the best-fit pose of the obstacle-map features relative to multiple sensors, which allows the sensor-to-sensor registration to be refined. In the following, we propose a localized, correlation-based approach for obstacle-map-level sensor fusion. Assume the laser map L_ij and the stereo map S_ij are to be merged to form F_ij. A map element takes the value 0 for a traversable pixel, 1 for an obstacle, and a value between 0 and 1 for the certainty that the pixel is classified as an obstacle. We formulate the correlation-based sensor fusion as


\[
F_{ij} =
\begin{cases}
L_{ij}, & S_{ij} = \text{undefined} \\[4pt]
S_{ij}, & L_{ij} = \text{undefined} \\[4pt]
\dfrac{a_1 L_{ij} + a_2 S_{i+m,j+n}}{a_1 + a_2}, & (m,n) = \arg\max_{m,n \in \Omega} \operatorname{Corr}(L_{ij}, S_{i+m,j+n}) \\[4pt]
\text{undefined}, & S_{ij},\, L_{ij} = \text{undefined}
\end{cases}
\tag{1.9}
\]
where \(\Omega\) represents the search area and \(\{a_1, a_2\}\) are weighting factors. \(\operatorname{Corr}(L, S)\) is the correlation between L and S elements with window size \(w_c\):
\[
\operatorname{Corr}(L_{ij}, S_{i+m,j+n}) = \sum_{p=-w_c/2}^{w_c/2} \; \sum_{q=-w_c/2}^{w_c/2} L_{i+p,j+q}\, S_{i+m+p,j+n+q}
\tag{1.10}
\]
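Equations (1.9) and (1.10) can be sketched directly per pixel. The sketch below, assuming NumPy arrays with `np.nan` marking an undefined map element and `omega` as the half-width of the square search area (both representation choices are ours, not the text's), keeps the four branches of Equation (1.9) explicit:

```python
import numpy as np

def corr(L, S, i, j, m, n, wc):
    """Eq. (1.10): windowed correlation between L centered at (i, j)
    and S centered at (i+m, j+n), with window size wc."""
    h = wc // 2
    total = 0.0
    for p in range(-h, h + 1):
        for q in range(-h, h + 1):
            total += L[i + p, j + q] * S[i + m + p, j + n + q]
    return total

def fuse_pixel(L, S, i, j, omega, wc, a1=0.5, a2=0.5):
    """Eq. (1.9): fuse laser and stereo certainty maps at pixel (i, j).
    np.nan marks an undefined element; a1, a2 are the weighting factors."""
    if np.isnan(L[i, j]) and np.isnan(S[i, j]):
        return np.nan                      # both maps undefined
    if np.isnan(S[i, j]):
        return L[i, j]                     # stereo undefined: keep laser
    if np.isnan(L[i, j]):
        return S[i, j]                     # laser undefined: keep stereo
    # Otherwise pick the offset (m, n) in the search area that
    # maximizes the correlation of Eq. (1.10), then blend.
    m, n = max(
        ((m, n) for m in range(-omega, omega + 1)
                for n in range(-omega, omega + 1)),
        key=lambda mn: corr(L, S, i, j, mn[0], mn[1], wc),
    )
    return (a1 * L[i, j] + a2 * S[i + m, j + n]) / (a1 + a2)
```

The search over \((m, n)\) is what makes the fusion localized: a small residual misregistration between the laser and stereo maps is absorbed by blending each laser pixel with the best-correlated nearby stereo pixel, so an obstacle reported by only one sensor (a likely false alarm) receives a reduced fused certainty.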




                              © 2006 by Taylor & Francis Group, LLC


