progress along the road map is through testing sensors, systems, and algorithms in the field, and then seeing what can survive the challenges presented.



ACKNOWLEDGMENTS

The authors would like to acknowledge the support of A*STAR and DSTA (Singapore) in funding project activities that have contributed to the findings presented in this chapter.


