point’s light rays on each camera’s image is depicted in camera-specific local coordinates. Thus, the origin for the coordinate frame referenced by points of the form $(x_l, y_l)$ is located at the center of lens $l$.
From figure 4.22, it can be seen that

$$\frac{x_l}{f} = \frac{x + b/2}{z} \qquad \text{and} \qquad \frac{x_r}{f} = \frac{x - b/2}{z} \qquad (4.25)$$
                           and (out of the plane of the page)

$$\frac{y_l}{f} = \frac{y_r}{f} = \frac{y}{z} \qquad (4.26)$$

                           where f  is the distance of both lenses to the image plane. Note from equation (4.25) that

$$\frac{x_l - x_r}{f} = \frac{b}{z} \qquad (4.27)$$
where the difference in the image coordinates, $x_l - x_r$, is called the disparity. This is an
                           important term in stereo vision, because it is only by measuring disparity that we can
recover depth information. Using the disparity and solving all three above equations provides formulas for the three dimensions of the scene point being imaged:
$$x = \frac{b\,(x_l + x_r)/2}{x_l - x_r}\,; \qquad y = \frac{b\,(y_l + y_r)/2}{x_l - x_r}\,; \qquad z = \frac{b\,f}{x_l - x_r} \qquad (4.28)$$
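As an illustrative check of equation (4.28), with numbers assumed here rather than taken from the text: for $f = 500$ pixel units, baseline $b = 0.1$ m, and a matched point imaged at $x_l = 40$ and $x_r = 20$ pixel units (measured from each image center), the disparity is 20, and equation (4.28) gives $z = b f / (x_l - x_r) = 0.1 \cdot 500 / 20 = 2.5$ m.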
                           Observations from these equations are as follows:
                           • Distance is inversely proportional to disparity. The distance to near objects can therefore
                             be measured more accurately than that to distant objects, just as with depth from focus
                             techniques. In general, this is acceptable for mobile robotics, because for navigation and
                             obstacle avoidance closer objects are of greater importance.
• Disparity is proportional to $b$. For a given disparity error, the accuracy of the depth estimate increases with increasing baseline $b$.
• As $b$ is increased, the physical separation between the cameras grows, so some objects may appear in one camera but not in the other. Such objects by definition will not have a disparity and therefore will not be ranged successfully.
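The triangulation of equation (4.28) is simple enough to state directly in code. The following Python sketch is illustrative only: the function name, the argument conventions, the guard against non-positive disparity, and the example values are assumptions and are not part of the text. It takes a matched pair of image coordinates, measured from each image center in the same units as $f$, and returns the scene point.

```python
def triangulate_stereo(x_l, y_l, x_r, y_r, f, b):
    """Recover the scene point (x, y, z) from a matched stereo image pair.

    x_l, y_l : image coordinates in the left camera, measured from the
               image center, in the same units as f
    x_r, y_r : image coordinates in the right camera
    f        : distance of both lenses to the image plane
    b        : baseline (separation between the two lens centers)
    """
    disparity = x_l - x_r                      # equation (4.27): disparity = b*f / z
    if disparity <= 0:
        # A point seen in only one camera (or a bad match) cannot be ranged.
        raise ValueError("non-positive disparity; point cannot be ranged")
    x = b * (x_l + x_r) / (2.0 * disparity)    # equation (4.28)
    y = b * (y_l + y_r) / (2.0 * disparity)
    z = b * f / disparity
    return x, y, z


# Example with assumed values: f = 500 px, b = 0.12 m,
# left image point (220, 40) px, right image point (180, 40) px.
# Disparity = 40 px, so z = 0.12 * 500 / 40 = 1.5 m.
print(triangulate_stereo(220, 40, 180, 40, f=500, b=0.12))  # ≈ (0.6, 0.12, 1.5)
```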