71. R. Pless, Using many cameras as one, Proc. CVPR, 2003, pp. II: 587–593.
72. D. Rees, Panoramic television viewing system, US Patent 3,505,465, April 1970.
73. W. Rucklidge, Efficient visual recognition using the Hausdorff distance, Lecture Notes in Computer Science, vol. 1173, Springer-Verlag, 1996.
74. J. Shi and C. Tomasi, Good features to track, Proc. of the IEEE Int. Conference
on Computer Vision and Pattern Recognition, June 1994, pp. 593–600.
75. S. Sinha and M. Pollefeys, Towards calibrating a pan-tilt-zoom camera network, OMNIVIS'04, Workshop on Omnidirectional Vision and Camera Networks (held with ECCV 2004), 2004.
76. S.N. Sinha and M. Pollefeys, Synchronization and calibration of camera networks from silhouettes, International Conference on Pattern Recognition (ICPR'04), vol. 1, 23–26 Aug. 2004, pp. 116–119.
77. T. Sogo, H. Ishiguro, and M. Trivedi, Real-time target localization and tracking by N-ocular stereo, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision (OMNIVIS'00) at CVPR 2000, June 2000.
78. M. Spetsakis and J. Aloimonos, Structure from motion using line correspondences, International Journal of Computer Vision 4 (1990), no. 3, 171–183.
79. P. Sturm, A method for 3D reconstruction of piecewise planar objects from single panoramic images, 1st International IEEE Workshop on Omni-directional Vision at CVPR, 2000, pp. 119–126.
80. P. Sturm and S. Ramalingam, A generic concept for camera calibration, Proceed-
ings of the European Conference on Computer Vision, Prague, Czech Republic,
vol. 2, Springer, May 2004, pp. 1–13.
81. W. Stürzl, H. Dahmen, and H. Mallot, The quality of catadioptric imaging: application to omnidirectional stereo, European Conference on Computer Vision, LNCS 3021, 2004, pp. 614–627.
82. T. Svoboda, T. Pajdla, and V. Hlaváč, Epipolar geometry for panoramic cameras, Proc. European Conf. Computer Vision, July 1998, pp. 218–231.
83. R. Talluri and J. K. Aggarwal, Mobile robot self-location using model-image
feature correspondence, IEEE Transactions on Robotics and Automation 12
(1996), no. 1, 63–77.
84. G. Thomas, Real-time panospheric image dewarping and presentation for remote mobile robot control, Advanced Robotics 17 (2003), no. 4, 359–368.
85. S. Thrun and A. Bücken, Integrating grid-based and topological maps for mobile robot navigation, Proceedings of the 13th National Conference on Artificial Intelligence (AAAI'96), 1996.
86. S. Watanabe, Karhunen-Loève expansion and factor analysis, Transactions of the 4th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, 1965, pp. 635–660.
87. R. Wehner and S. Wehner, Insect navigation: use of maps or Ariadne's thread?, Ethology Ecology & Evolution 2 (1990), 27–48.
88. N. Winters, A holistic approach to mobile robot navigation using omnidirectional
vision, Ph.D. thesis, University of Dublin, Trinity College, 2002.
89. N. Winters, J. Gaspar, G. Lacey, and J. Santos-Victor, Omni-directional vision
for robot navigation, 1st International IEEE Workshop on Omni-directional
Vision at CVPR, 2000, pp. 21–28.
90. N. Winters and J. Santos-Victor, Omni-directional visual navigation, 7th International Symposium on Intelligent Robotic Systems (SIRS'99), July 1999, pp. 109–118.