Vision is an extraordinarily powerful sense. The ability to perceive the environment allows movement to be regulated by the world. Humans do this effortlessly, yet we still lack an understanding of how perception works. Our approach to gaining insight into this complex problem is to build artificial visual systems for semi-autonomous robot navigation, supported by human-robot interfaces for destination specification. We examine how robots can robustly use images, which convey only 2D information, to drive their actions in 3D space. Our work provides robots with the perceptual capabilities to undertake everyday navigation tasks, such as "go to the fourth office in the second corridor." We present a complete navigation system with a focus on building mediated perception modalities, in line with Marr's theory [57]. We address the fundamental design issues associated with this goal, namely sensor design, environmental representations, navigation control, and user interaction.
References
S. Baker and S. K. Nayar, A theory of catadioptric image formation, Proc. Int. Conf. Computer Vision (ICCV’97), January 1998, pp. 35-42.
——, A theory of single-viewpoint catadioptric image formation, International Journal of Computer Vision 35 (1999), no. 2, 175-196.
R. Benosman and S. B. Kang (eds.), Panoramic vision, Springer Verlag, 2001.
M. Betke and L. Gurvits, Mobile robot localization using landmarks, IEEE Trans. on Robotics and Automation 13 (1997), no. 2, 251-263.
J. Borenstein, H. R. Everett, and Liqiang Feng, Navigating mobile robots: Sensors and techniques, A. K. Peters, Ltd., Wellesley, MA, 1996 (also: Where am I? Systems and Methods for Mobile Robot Positioning, ftp://ftp.eecs.umich.edu/people/johannb/pos96rep.pdf).
G. Borgefors, Hierarchical chamfer matching: A parametric edge matching algorithm, IEEE Transactions on Pattern Analysis and Machine Intelligence 10 (1988), no. 6, 849-865.
R. Brooks, Visual map making for a mobile robot, Proc. IEEE Conf. on Robotics and Automation, 1985.
R. A. Brooks, A robust layered control system for a mobile robot, IEEE Transactions on Robotics and Automation 2 (1986), 14-23.
A. Bruckstein and T. Richardson, Omniview cameras with curved surface mirrors, Proceedings of the IEEE Workshop on Omnidirectional Vision at CVPR 2000, June 2000, first published in 1996 as a Bell Labs Technical Memo, pp. 79-86.
D. Burschka, J. Geiman, and G. Hager, Optimal landmark configuration for vision-based control of mobile robots, Proc. IEEE Int. Conf. on Robotics and Automation, 2003, pp. 3917-3922.
Z. L. Cao, S. J. Oh, and E.L. Hall, Dynamic omni-directional vision for mobile robots, Journal of Robotic Systems 3 (1986), no. 1, 5-17.
J. S. Chahl and M. V. Srinivasan, Reflective surfaces for panoramic imaging, Applied Optics 36 (1997), no. 31, 8275-8285.
P. Chang and M. Hebert, Omni-directional structure from motion, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision (OMNIVIS'00) at CVPR 2000, June 2000.
R. Collins and R. Weiss, Vanishing point calculation as a statistical inference on the unit sphere, Int. Conf. on Computer Vision (ICCV), 1990, pp. 400-403.
T. Conroy and J. Moore, Resolution invariant surfaces for panoramic vision systems, IEEE ICCV’99, 1999, pp. 392-397.
Olivier Cuisenaire, Distance transformations: Fast algorithms and applications to medical image processing, Ph.D. thesis, U. Catholique de Louvain, October 1999.
K. Daniilidis (ed.), 1st International IEEE Workshop on Omnidirectional Vision at CVPR 2000, June 2000.
——, Page of omnidirectional vision hosted by the GRASP Laboratory, http://www.cis.upenn.edu/~kostas/omni.html, 2005.
P. David, D. DeMenthon, and R. Duraiswami, Simultaneous pose and correspondence determination using line features, Proc. IEEE Conf. Comp. Vision Patt. Recog., 2003.
A. Davison, Real-time simultaneous localisation and mapping with a single camera, IEEE Int. Conf. on Computer Vision, vol. 2, 2003, pp. 1403-1410.
C. Canudas de Wit, H. Khennouf, C. Samson, and O. J. Sordalen, Chap. 5: Nonlinear control design for mobile robots, Nonlinear control for mobile robots (Yuan F. Zheng, ed.), World Scientific Series in Robotics and Intelligent Systems, 1993.
P. E. Debevec, C. J. Taylor, and J. Malik, Modeling and rendering architecture from photographs: a hybrid geometry and image-based approach, SIGGRAPH, 1996.
S. Derrien and K. Konolige, Approximating a single viewpoint in panoramic imaging devices, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision at CVPR 2000, June 2000, pp. 85-90.
G. DeSouza and A. Kak, Vision for mobile robot navigation: A survey, IEEE Transactions on Pattern Analysis and Machine Intelligence 24 (2002), no. 2, 237-267.
O. Faugeras, Three-dimensional computer vision - a geometric viewpoint, MIT Press, 1993.
Mark Fiala, Panoramic computer vision, Ph.D. thesis, University of Alberta, 2002.
S. Fleck, F. Busch, P. Biber, H. Andreasson, and W. Straßer, Omnidirectional 3D modeling on a mobile robot using graph cuts, Proc. IEEE Int. Conf. on Robotics and Automation, 2005, pp. 1760-1766.
J. Foote and D. Kimber, FlyCam: Practical panoramic video and automatic camera control, Proc. of the IEEE Int. Conference on Multimedia and Expo, vol. III, August 2000, pp. 1419-1422.
S. Gaechter and T. Pajdla, Mirror design for an omnidirectional camera with a uniform cylindrical projection when using the SVAVISCA sensor, Tech. report, Czech Tech. Univ. - Faculty of Electrical Eng., ftp://cmp.felk.cvut.cz/pub/cmp/articles/pajdla/Gaechter-TR-2001-03.pdf, March 2001.
S. Gaechter, T. Pajdla, and B. Micusik, Mirror design for an omnidirectional camera with a space variant imager, IEEE Workshop on Omnidirectional Vision Applied to Robotic Orientation and Nondestructive Testing, August 2001, pp. 99-105.
J. Gaspar, Omnidirectional vision for mobile robot navigation, Ph.D. thesis, Instituto Superior Técnico, Dept. Electrical Engineering, Lisbon - Portugal, 2003.
J. Gaspar, C. Deccó, J. Okamoto Jr, and J. Santos-Victor, Constant resolution omnidirectional cameras, 3rd International IEEE Workshop on Omni-directional Vision at ECCV, 2002, pp. 27-34.
J. Gaspar, E. Grossmann, and J. Santos-Victor, Interactive reconstruction from an omnidirectional image, 9th International Symposium on Intelligent Robotic Systems (SIRS’01), July 2001.
J. Gaspar and J. Santos-Victor, Visual path following with a catadioptric panoramic camera, Int. Symp. Intelligent Robotic Systems, July 1999, pp. 139-147.
J. Gaspar, N. Winters, and J. Santos-Victor, Vision-based navigation and environmental representations with an omni-directional camera, IEEE Transactions on Robotics and Automation 16 (2000), no. 6, 890-898.
D. Gavrila and V. Philomin, Real-time object detection for smart vehicles, IEEE, Int. Conf. on Computer Vision (ICCV), 1999, pp. 87-93.
C. Geyer and K. Daniilidis, A unifying theory for central panoramic systems and practical applications, ECCV 2000, June 2000, pp. 445-461.
——, Catadioptric projective geometry, International Journal of Computer Vision 43 (2001), 223-243.
Gene H. Golub and Charles F. Van Loan, Matrix computations, third ed., Johns Hopkins Studies in the Mathematical Sciences, The Johns Hopkins University Press, 1996. MR 1 417 720.
P. Greguss, Panoramic imaging block for 3d space, US patent 4,566,763, January 1986, Hungarian Patent granted in 1983.
P. Greguss (ed.), IEEE ICAR 2001 Workshop on Omnidirectional Vision Applied to Robotic Orientation and Non-Destructive Testing, August 2001.
E. Grossmann, D. Ortin, and J. Santos-Victor, Algebraic aspects of reconstruction of structured scenes from one or more views, British Machine Vision Conference, BMVC2001, September 2001, pp. 633-642.
Etienne Grossmann, Maximum likelihood 3D reconstruction from one or more uncalibrated views under geometric constraints, Ph.D. thesis, Instituto Superior Técnico, Dept. Electrical Engineering, Lisbon - Portugal, 2002.
E. Hecht and A. Zajac, Optics, Addison Wesley, 1974.
R. Hicks, The page of catadioptric sensor design, http://www.math.drexel.edu/~ahicks/design/, 2004.
R. Hicks and R. Bajcsy, Catadioptric sensors that approximate wide-angle perspective projections, Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR'00), June 2000, pp. 545-551.
A. Howard, M.J. Mataric, and G. Sukhatme, Putting the ‘i’ in ‘team’: an ego-centric approach to cooperative localization, IEEE Int. Conf. on Robotics and Automation, 2003.
D. Huttenlocher, G. Klanderman, and W. Rucklidge, Comparing images using the Hausdorff distance, IEEE Transactions on Pattern Analysis and Machine Intelligence 15 (1993), no. 9, 850-863.
D. Huttenlocher, R. Lilien, and C. Olsen, View-based recognition using an eigenspace approximation to the Hausdorff measure, IEEE Transactions on Pattern Analysis and Machine Intelligence 21 (1999), no. 9, 951-956.
S. B. Kang and R. Szeliski, 3D scene data recovery using omnidirectional multi-baseline stereo, CVPR, 1996, pp. 364-370.
N. Karlsson, E. Di Bernardo, J. Ostrowski, L. Goncalves, P. Pirjanian, and M. Munich, The vSLAM algorithm for robust localization and mapping, Proc. IEEE Int. Conf. on Robotics and Automation, 2005, pp. 24-29.
A. Kosaka and A. Kak, Fast vision-guided mobile robot navigation using model-based reasoning and prediction of uncertainties, CVGIP: Image Understanding 56 (1992), no. 3, 271-329.
J. J. Leonard and H. F. Durrant-Whyte, Mobile robot localization by tracking geometric beacons, IEEE Trans. on Robotics and Automation 7 (1991), no. 3, 376-382.
R. Lerner, E. Rivlin, and I. Shimshoni, Landmark selection for task-oriented navigation, Proc. Int. Conf. on Intelligent Robots and Systems, 2006, pp. 2785-2791.
LIRA-Lab, Document on specification, Tech. report, Esprit Project n. 31951-SVAVISCA - available at http://www.lira.dist.unige.it - SVAVISCA-GIOTTO Home Page, May 1999.
A. Majumder, W. Seales, G. Meenakshisundaram, and H. Fuchs, Immersive tele-conferencing: A new algorithm to generate seamless panoramic video imagery, Proceedings of the 7th ACM Conference on Multimedia, 1999.
D. Marr, Vision, W.H. Freeman, 1982.
B. McBride, Panoramic cameras time line, http://panphoto.com/TimeLine.html.
B. Micusik and T. Pajdla, Structure from motion with wide circular field of view cameras, IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI) 28 (2006), no. 7, 1135-1149.
K. Miyamoto, Fish-eye lens, Journal of the Optical Society of America 54 (1964), no. 8, 1060-1061.
L. Montesano, J. Gaspar, J. Santos-Victor, and L. Montano, Cooperative localization by fusing vision-based bearing measurements and motion, Int. Conf. on Intelligent Robotics and Systems, 2005, pp. 2333-2338.
H. Murase and S. K. Nayar, Visual learning and recognition of 3D objects from appearance, International Journal of Computer Vision 14 (1995), no. 1, 5-24.
V. Nalwa, A true omni-directional viewer, Technical report, Bell Laboratories, February 1996.
S. K. Nayar, Catadioptric image formation, Proc. of the DARPA Image Under-standing Workshop, May 1997, pp. 1431-1437.
——, Catadioptric omnidirectional camera, Proc. IEEE Conf. Computer Vision and Pattern Recognition, June 1997, pp. 482-488.
S. K. Nayar and V. Peri, Folded catadioptric cameras, Proceedings of the IEEE Computer Vision and Pattern Recognition Conference, June 1999.
E. Oja, Subspace methods for pattern recognition, Research Studies Press, 1983.
M. Ollis, H. Herman, and S. Singh, Analysis and design of panoramic stereo using equi-angular pixel cameras, Tech. report CMU-RI-TR-99-04, Carnegie Mellon University Robotics Institute, 1999.
T. Pajdla and V. Hlavac, Zero phase representation of panoramic images for image based localization, 8th Inter. Conf. on Computer Analysis of Images and Patterns CAIP’99, 1999.
V. Peri and S. K. Nayar, Generation of perspective and panoramic video from omnidirectional video, Proc. DARPA Image Understanding Workshop, 1997, pp. 243-246.
R. Pless, Using many cameras as one, Proc. CVPR, 2003, pp. II: 587-593.
D. Rees, Panoramic television viewing system, US Patent 3,505,465, April 1970.
W. Rucklidge, Efficient visual recognition using the Hausdorff distance, Lecture Notes in Computer Science, vol. 1173, Springer-Verlag, 1996.
J. Shi and C. Tomasi, Good features to track, Proc. of the IEEE Int. Conference on Computer Vision and Pattern Recognition, June 1994, pp. 593-600.
S. Sinha and M. Pollefeys, Towards calibrating a pan-tilt-zoom camera network, OMNIVIS’04, workshop on Omnidirectional Vision and Camera Networks (held with ECCV 2004), 2004.
S. N. Sinha and M. Pollefeys, Synchronization and calibration of camera networks from silhouettes, International Conference on Pattern Recognition (ICPR'04), vol. 1, 23-26 Aug. 2004, pp. 116-119.
T. Sogo, H. Ishiguro, and M. Trivedi, Real-time target localization and tracking by n-ocular stereo, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision (OMNIVIS'00) at CVPR 2000, June 2000.
M. Spetsakis and J. Aloimonos, Structure from motion using line correspondences, International Journal of Computer Vision 4 (1990), no. 3, 171-183.
P. Sturm, A method for 3D reconstruction of piecewise planar objects from single panoramic images, 1st International IEEE Workshop on Omnidirectional Vision at CVPR, 2000, pp. 119-126.
P. Sturm and S. Ramalingam, A generic concept for camera calibration, Proceedings of the European Conference on Computer Vision, Prague, Czech Republic, vol. 2, Springer, May 2004, pp. 1-13.
W. Sturzl, H. Dahmen, and H. Mallot, The quality of catadioptric imaging - application to omnidirectional stereo, European Conference on Computer Vision, 2004, LNCS 3021, pp. 614-627.
T. Svoboda, T. Pajdla, and V. Hlaváč, Epipolar geometry for panoramic cameras, Proc. European Conf. Computer Vision, July 1998, pp. 218-231.
R. Talluri and J. K. Aggarwal, Mobile robot self-location using model-image feature correspondence, IEEE Transactions on Robotics and Automation 12 (1996), no. 1, 63-77.
G. Thomas, Real-time panospheric image dewarping and presentation for remote mobile robot control, Journal of Advanced Robotics 17 (2003), no. 4, 359-368.
S. Thrun and A. Bucken, Integrating grid-based and topological maps for mobile robot navigation, Proceedings of the 13th National Conference on Artificial Intelligence (AAAI'96), 1996.
S. Watanabe, Karhunen-Loève expansion and factor analysis, Transactions of the 4th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, 1965, pp. 635-660.
R. Wehner and S. Wehner, Insect navigation: use of maps or Ariadne's thread?, Ethology, Ecology, Evolution 2 (1990), 27-48.
N. Winters, A holistic approach to mobile robot navigation using omnidirectional vision, Ph.D. thesis, University of Dublin, Trinity College, 2002.
N. Winters, J. Gaspar, G. Lacey, and J. Santos-Victor, Omni-directional vision for robot navigation, 1st International IEEE Workshop on Omni-directional Vision at CVPR, 2000, pp. 21-28.
N. Winters and J. Santos-Victor, Omni-directional visual navigation, 7th Inter-national Symposium on Intelligent Robotics Systems (SIRS’99), July 1999, pp. 109-118.
N. Winters and G. Lacey, Overview of tele-operation for a mobile robot, TMR Workshop on Computer Vision and Mobile Robots (CVMR'98), September 1999.
P. Wunsch and G. Hirzinger, Real-time visual tracking of 3-D objects with dynamic handling of occlusion, IEEE Int. Conf. on Robotics and Automation, April 1997, pp. 2868-2873.
Y. Yagi, Omnidirectional sensing and its applications, IEICE Transactions on Information and Systems (1999), no. E82-D-3, 568-579.
Y. Yagi, Y. Nishizawa, and M. Yachida, Map-based navigation for mobile robot with omnidirectional image sensor COPIS, IEEE Trans. Robotics and Automation 11 (1995), no. 5, 634-648.
K. Yamazawa, Y. Yagi, and M. Yachida, Obstacle detection with omnidirectional image sensor HyperOmni Vision, IEEE ICRA, 1995, pp. 1062-1067.
J. Zheng and S. Tsuji, Panoramic representation for route recognition by a mobile robot, International Journal of Computer Vision 9 (1992), no. 1, 55-76.
© 2007 Springer-Verlag Berlin Heidelberg
Cite this chapter
Gaspar, J., Winters, N., Grossmann, E., Victor, J.S. (2007). Toward Robot Perception through Omnidirectional Vision. In: Chahl, J.S., Jain, L.C., Mizutani, A., Sato-Ilic, M. (eds) Innovations in Intelligent Machines - 1. Studies in Computational Intelligence, vol 70. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72696-8_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-72695-1
Online ISBN: 978-3-540-72696-8