Environment Virtualization for Visual Localization and Mapping
Mobile robotics has become an essential topic in many subjects within most Bachelor's and Master's degrees in engineering. Visual sensors have emerged as a powerful tool for performing reliable localization and mapping tasks with a mobile robot. Moreover, the use of images enables other high-level tasks such as object and people detection, recognition, or tracking. Nonetheless, our teaching experience confirms that students encounter many difficulties before they can tackle visual localization and mapping algorithms. Initial stages such as data acquisition (images and trajectory), preprocessing, or visual feature extraction usually demand considerable effort from many students. Consequently, the teaching process is prolonged, while active learning and student achievement suffer. Considering these facts, we have implemented a Matlab software tool that generates an open variety of virtual environments. It allows students to easily obtain synthetic raw data according to a predefined robot trajectory inside the designed environment. The virtualization software also produces a set of images along the trajectory for performing visual localization and mapping experiments. As a result, the overall testing procedure is alleviated, and students report taking better advantage of the lectures and practical sessions, demonstrating higher achievement in terms of comprehension of fundamental mobile robotics concepts. Comparative results regarding student achievement, engagement, satisfaction, and attitude toward the use of the tool are presented.
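The core idea of the tool, generating synthetic raw data (observations along a predefined robot trajectory in a virtual environment), can be illustrated with a minimal sketch. The original software is written in Matlab; the sketch below uses Python for brevity, and all names, the circular trajectory, the point landmarks, and the bearing-only observation model are illustrative assumptions, not the paper's actual implementation.

```python
import math
import random

def circular_trajectory(radius=2.0, n_poses=20):
    """Illustrative predefined trajectory: (x, y, heading) poses on a circle."""
    poses = []
    for k in range(n_poses):
        theta = 2.0 * math.pi * k / n_poses
        x = radius * math.cos(theta)
        y = radius * math.sin(theta)
        heading = theta + math.pi / 2.0  # tangent to the circle
        poses.append((x, y, heading))
    return poses

def bearing_observations(pose, landmarks, noise_std=0.01):
    """Noisy relative bearings to each landmark, mimicking the 360-degree
    field of view of an omnidirectional camera (assumed model)."""
    x, y, heading = pose
    obs = []
    for (lx, ly) in landmarks:
        azimuth = math.atan2(ly - y, lx - x) - heading
        # wrap to [-pi, pi] before adding measurement noise
        azimuth = math.atan2(math.sin(azimuth), math.cos(azimuth))
        obs.append(azimuth + random.gauss(0.0, noise_std))
    return obs

# Synthetic environment: a few point landmarks placed by the user.
landmarks = [(3.0, 0.0), (0.0, 3.0), (-3.0, -1.0)]
trajectory = circular_trajectory()
dataset = [bearing_observations(p, landmarks) for p in trajectory]
```

Students can then feed `dataset` and `trajectory` to a localization or mapping algorithm without first solving the acquisition and preprocessing stages, which is the pedagogical point of the virtualization tool.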
Keywords: Virtual environments · Visual localization · Omnidirectional image · Simulation
This work has been partially supported by: the Spanish Government (DPI2016-78361-R, AEI/FEDER, UE); the Valencian Research Council and the European Social Fund (post-doctoral grant APOSTD/2017/028).