Vol. 14 No. 1 (2015): Revista UIS Ingenierías
Articles

Reactive navigation scheme with RGB-D sensors

Andrés Felipe Suárez Sánchez
Universidad del Valle
Humberto Loaiza Correa
Universidad del Valle

Published 2014-12-23

Keywords

  • Computer Vision,
  • mobile robotics,
  • perception,
  • navigation,
  • reconstruction,
  • Kinect,
  • point cloud,
  • potential fields

How to Cite

Suárez Sánchez, A. F., & Loaiza Correa, H. (2014). Reactive navigation scheme with RGB-D sensors. Revista UIS Ingenierías, 14(1), 7–19. Retrieved from https://revistas.uis.edu.co/index.php/revistauisingenierias/article/view/7-19

Abstract

This paper presents the integration of a mobile robot with a vision system that allows a mobile agent to detect and avoid obstacles in its path. The vision system is able to build a 3D model of the local scene as a point cloud representation. The hardware setup consists of an inexpensive RGB-D camera, a processing unit, and a robotic platform. For visual perception, an algorithm was implemented that reduces the 3D information to 2D data similar to that provided by range sensors, in order to shorten response time and reduce the amount of data handled by the navigation algorithms. The whole system was tested in a controlled environment, passing several navigation and obstacle-avoidance tests. This work explores the capabilities of RGB-D sensors for mobile robotics in the Colombian Pacific region and is the first step of a research proposal that aims to implement a solution to the problem of Simultaneous Localization and Mapping (SLAM).
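
The abstract describes, but does not detail, the 3D-to-2D reduction step. The following is a minimal sketch of one way such a reduction could be done, assuming a metric depth image from a Kinect-style RGB-D sensor; all names and parameter values (FX, FY, CX, CY, HEIGHT_BAND, N_SECTORS, FOV, depth_to_points, points_to_scan) are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

# --- Hypothetical sensor parameters (placeholders, not the authors' values) ---
FX, FY = 525.0, 525.0        # focal lengths in pixels (typical Kinect-like values)
CX, CY = 319.5, 239.5        # principal point
HEIGHT_BAND = (-0.1, 0.4)    # vertical slice (m) around the robot body to keep
N_SECTORS = 64               # angular resolution of the resulting 2D "scan"
FOV = np.deg2rad(57.0)       # approximate horizontal field of view

def depth_to_points(depth_m):
    """Back-project a depth image (in meters) into camera-frame 3D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop invalid (zero-depth) pixels

def points_to_scan(points):
    """Collapse the point cloud into a 2D range profile, like a planar range sensor."""
    # keep only points in a horizontal band at the height of the robot body
    band = points[(points[:, 1] > HEIGHT_BAND[0]) & (points[:, 1] < HEIGHT_BAND[1])]
    angles = np.arctan2(band[:, 0], band[:, 2])        # bearing in the X-Z plane
    ranges = np.hypot(band[:, 0], band[:, 2])          # planar distance to each point
    scan = np.full(N_SECTORS, np.inf)
    sector = ((angles + FOV / 2) / FOV * N_SECTORS).astype(int)
    valid = (sector >= 0) & (sector < N_SECTORS)
    np.minimum.at(scan, sector[valid], ranges[valid])  # nearest obstacle per sector
    return scan
```

The resulting per-sector range array can then be fed to a standard reactive controller such as the potential-fields approach named in the keywords; the thresholds and field of view above are placeholders and would have to be set for the actual sensor and platform.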
