                           The remainder of this section shows how to obtain a 3D reconstruction from
                           this information.

                           Using Back-projection to form Perspective Images

In this section, we derive a transformation, applicable to single projection centre omnidirectional cameras, that yields images as if they were acquired by perspective projection cameras. This is interesting because it provides a way to apply methodologies developed for perspective cameras directly to omnidirectional cameras. In particular, the interactive scene reconstruction method (described in the following sections) follows this approach of using omnidirectional cameras transformed to perspective cameras.
                              The acquisition of correct perspective images, independent of the scenario,
                           requires that the vision sensor be characterised by a single projection centre
                           [2]. The unified projection model has, by definition, this property but, due to
                           the intermediate mapping over the sphere, the obtained images are in general
                           not perspective.
In order to obtain correct perspective images, the spherical projection must first be reversed from the image plane to the sphere surface and then re-projected to the desired plane from the sphere centre. We term this reverse projection back-projection.
                              The back-projection of an image pixel (u, v), obtained through spherical
                           projection, yields a 3D direction k · (x, y, z) given by the following equations
                           derived from Eq. (1):
$$
a = (l + m), \qquad b = (u^2 + v^2)
$$
$$
\begin{bmatrix} x \\ y \end{bmatrix}
= \frac{l\,a - \operatorname{sign}(a)\,\sqrt{a^2 + (1 - l^2)\,b}}{a^2 + b}
\begin{bmatrix} u \\ v \end{bmatrix}
\qquad (25)
$$
$$
z = \pm\sqrt{1 - x^2 - y^2}
$$
where z is negative if |a|/l > √b, and positive otherwise. It is assumed, without loss of generality, that (x, y, z) lies on the surface of the unit sphere. Figure 17 illustrates the back-projection. Given an omnidirectional image, we use back-projection to map image points to the surface of a sphere centred at the camera viewpoint.^{10}
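As a concrete illustration, the sketch below evaluates Eq. (25) numerically with NumPy. This is a minimal sketch rather than the authors' implementation: the function name back_project and the convention that (u, v) is already centred on the principal point and expressed in the units of the unified projection model are assumptions made here for illustration.

```python
import numpy as np

def back_project(u, v, l, m):
    """Back-project an image point (u, v) onto the unit sphere via Eq. (25).

    l, m are the unified projection model parameters; (u, v) is assumed
    to be centred on the principal point. Returns (x, y, z) on the unit
    sphere centred at the camera viewpoint.
    """
    a = l + m
    b = u**2 + v**2

    # Common factor multiplying (u, v) in Eq. (25).
    k = (l * a - np.sign(a) * np.sqrt(a**2 + (1.0 - l**2) * b)) / (a**2 + b)
    x, y = k * u, k * v

    # z completes the unit vector; its sign follows the rule stated after
    # Eq. (25): negative when |a| / l > sqrt(b), positive otherwise.
    z = np.sqrt(max(1.0 - x**2 - y**2, 0.0))
    if abs(a) / l > np.sqrt(b):
        z = -z
    return np.array([x, y, z])
```

Computing z from 1 − x² − y² keeps the returned direction on the unit sphere by construction, matching the assumption made in the text.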
At this point, it is worth noting that the set M = {P : P = (x, y, z)}, interpreted as points of the projective plane, already defines a perspective image. By rotating and scaling the set M one obtains specific viewing directions and


10 The omnidirectional camera utilized here is based on a spherical mirror and therefore does not have a single projection centre. However, as the scene depth is large compared to the sensor size, the sensor approximates a single projection centre system (details in [33]). Hence it is possible to find the parameters of the corresponding unified projection model system and use Eq. (25).
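Following on from the back-projection sketch above, the next fragment re-projects the set M from the sphere centre onto a chosen plane, i.e. it synthesizes a perspective view for a selected viewing direction. It is again a hedged sketch under stated assumptions: the rotation matrix R (which selects the viewing direction), the focal length f (which scales the image), and the function name to_perspective are illustrative choices, not notation from the chapter.

```python
import numpy as np

def to_perspective(points_on_sphere, R, f):
    """Re-project back-projected unit-sphere points to a perspective image.

    points_on_sphere: (N, 3) array of directions (the set M).
    R: 3x3 rotation selecting the desired viewing direction (the rotated
       z-axis is the optical axis of the synthetic perspective camera).
    f: focal length (scale) of the synthetic perspective camera.
    Returns (N, 2) image coordinates; points not in front of the chosen
    plane are returned as NaN.
    """
    P = points_on_sphere @ R.T            # rotate the set M
    uv = np.full((P.shape[0], 2), np.nan)
    front = P[:, 2] > 0                   # keep points in front of the plane
    uv[front] = f * P[front, :2] / P[front, 2:3]
    return uv
```

For instance, with R = np.eye(3) and a hypothetical focal length f = 300, the call returns the perspective view along the original optical axis; other rotations yield other viewing directions, as discussed above.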