Abstract
We present a vision-based approach for the navigation of humanoid robots in networks of corridors connected through curves and junctions. The objective of the humanoid is to follow the corridors, walking as close as possible to their center to maximize motion safety, and to turn at curves and junctions. Our control algorithm is inspired by a technique originally designed for unicycle robots, which we have adapted to humanoid navigation and extended to cope with the presence of turns and junctions. In addition, we prove that the corridor-following control law provides asymptotic convergence of the robot heading and position to the corridor bisector even when the corridor walls are not parallel. A state transition system is designed to allow navigation in mazes of corridors, curves and T-junctions. Extensive experimental validation demonstrates the validity and robustness of the approach.
Notes
1. We consider only the case of corridor guidelines that converge with respect to the robot's direction of motion. Following corridors with divergent guidelines is limited by the dimension of the camera field of view. The proposed approach remains valid in this case, but the technical details of its application are not discussed here for lack of space.
Appendix
In this section we analyze the perturbation terms in Eq. (11) to prove that they are vanishing and locally Lipschitz around the equilibrium point \((x_m, x_v)=(0, 0)\).
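For reference, the two properties to be established can be stated compactly (standard definitions from nonlinear systems theory; the symbols \(z\), \(p\) and \(L\) are introduced here only for illustration):

```latex
% Perturbed system, with z = (x_m, x_v)^T and perturbation p:
\dot{z} = f(z) + p(z)
% Vanishing perturbation: the origin remains an equilibrium,
p(0, 0) = 0
% Local Lipschitz continuity: for some L > 0 and all z_1, z_2
% in a neighborhood of the origin,
\| p(z_1) - p(z_2) \| \le L \, \| z_1 - z_2 \|
```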
Computing the time derivative of the visual features expressions (6) and (8) in the case of non-parallel corridor walls, and using the unicycle model (1) and the control designed on the nominal system (9), we obtain, for the middle point \(x_m\), the closed-loop dynamics
where the perturbation \(p_m\) is composed of the terms
with
To show that the perturbation is vanishing, we should invert the map \((x,\theta )\rightarrow (x_m, x_v)\) so as to express the perturbation term as a function of \((x_m, x_v)\) only. However, in Proposition 1 we have shown that \((x_m, x_v)^T=(0, 0)^T\) implies \((x, \theta )^T=(0, 0)^T\), except on points of the circle with center \(P_V\), the intersection point of the two corridor guidelines, and radius \(r_V=\frac{h}{\tan \gamma }\).
These points, however, cannot be stable equilibria of the closed-loop dynamics, as also shown in Proposition 1. In addition, considering the robot footprint with respect to the corridor width, these points can hardly be reached in practical situations. Hence, by ignoring these points, we can evaluate the perturbation terms at the equilibrium of the nominal system by setting \((x_m, x_v)=(0, 0)\) and \((x, \theta )=(0,0)\).
A quick inspection of Eq. (12) shows that \(p_m\) is null at the equilibrium point \((x_m, x_v) = (0, 0)\), implying that the perturbation induced by the non-parallel corridor guidelines on the closed-loop nominal dynamics is non-persistent.
The perturbation term is composed of sums and products of functions that are locally Lipschitz around the equilibrium of the nominal system, with the exception of the terms \(A_m\) and \(E_m\), which present the following singularities, respectively.
- \(y-y_V = \frac{x_v}{k_1}x\): this singularity cannot occur around the equilibrium point, since it would imply that the robot is very close to the point \(P_V\) at the intersection of the corridor guidelines, a situation that is physically impossible;
- \(\tan \theta = \pm 1/\sigma \): this condition is verified if the robot heading is perpendicular to the corridor walls, a situation not possible around the origin of the system if \(\sigma \ne 0\).
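As a minimal numerical illustration of the second condition (the values of \(\sigma\) and \(\theta\) below are chosen purely for illustration, not taken from the paper), \(\tan\theta\) stays far from the singular value \(\pm 1/\sigma\) for headings near the origin:

```python
import math

# Illustrative values: relative wall slope sigma (nonzero) and
# headings theta in a small neighborhood of the origin (radians).
sigma = 0.1
singular_value = 1.0 / sigma  # tan(theta) = ±1/sigma is the singularity

for theta in [-0.2, -0.1, 0.0, 0.1, 0.2]:
    t = math.tan(theta)
    # Near the origin |tan(theta)| << 1/|sigma|, so the singular
    # condition tan(theta) = ±1/sigma cannot be met.
    margin = singular_value - abs(t)
    print(f"theta={theta:+.1f}  tan(theta)={t:+.3f}  margin to 1/sigma={margin:.3f}")
```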
Finally, given the terms in (12), \(p_m\) can be easily expressed as \( p_m = \sigma \tilde{p}_m. \)
To prove that the perturbation of the vanishing point closed-loop dynamics is non-persistent and locally Lipschitz at the equilibrium of the nominal system, we proceed in a way analogous to the case of the middle point. The closed-loop dynamics of \(x_v\) in the case of non-parallel guidelines is
where \(f_v (x_m,x_v)\) represents the nominal closed-loop dynamics in Eq. (10), while the perturbation \(p_v\) is composed of the following terms
where
As in the previous case, we can prove that the perturbation vanishes by setting \((x_m, x_v)=(0, 0)\) and \((x, \theta )=(0, 0)\). Also in this case, the perturbation term has some singularities. In particular, the term \(E_v\) presents the same singularity as \(A_m\) in Eq. (12), discussed above. The term \(1+C_v\) in the denominator of \(D_v\) is null if the following equation holds
The simple geometric construction in Fig. 18 shows that the term on the left represents the signed distance of the robot to the line orthogonal to the optical axis (directed as the robot heading) and passing through the point \(P_V\) where the corridor guidelines intersect. This distance becomes negative, and eventually equal to \(-h\tan \gamma \), if the robot crosses this line. Around the equilibrium point this would mean that the robot crosses this line close to \(P_V\), again a non-operative condition.
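This signed distance admits a simple sketch (a minimal illustration only; the function name, coordinate conventions and numerical values below are assumptions, not the paper's notation):

```python
import math

def signed_distance_to_line(robot_xy, theta, p_v):
    """Signed distance from the robot to the line through P_V that is
    orthogonal to the optical axis (i.e. to the robot heading).
    Positive while P_V lies ahead of the robot, negative once the
    robot has crossed the line. Hypothetical helper for illustration."""
    ux, uy = math.cos(theta), math.sin(theta)        # heading unit vector
    dx, dy = p_v[0] - robot_xy[0], p_v[1] - robot_xy[1]
    return dx * ux + dy * uy                          # projection onto heading

# P_V two meters ahead along the heading: positive distance.
print(signed_distance_to_line((0.0, 0.0), 0.0, (2.0, 0.0)))  # 2.0
# Robot past the line through P_V: the distance becomes negative.
print(signed_distance_to_line((3.0, 0.0), 0.0, (2.0, 0.0)))  # -1.0
```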
We can then state that around the equilibrium the perturbation term \(p_v\) is locally Lipschitz, since it is given by sums and products of locally Lipschitz functions. Analogously to the perturbation of the nominal middle point dynamics, \(p_v\) can be written as \(p_v = \sigma \tilde{p}_v\).
Summing up, the perturbation terms generated by the non-parallel corridor walls have been shown to be vanishing and locally Lipschitz around the equilibrium of the nominal dynamics. Furthermore, the perturbation term is proportional to the perturbation parameter \(\sigma \), which represents the relative slope of the corridor walls.
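These two properties are precisely the hypotheses of the classical vanishing-perturbation result for nonlinear systems; a sketch of the conclusion, with \(z=(x_m,x_v)^T\), \(f\) the nominal closed-loop dynamics and \(\tilde{p}=(\tilde{p}_m,\tilde{p}_v)^T\):

```latex
% Perturbed closed-loop dynamics:
\dot{z} = f(z) + \sigma \, \tilde{p}(z), \qquad \tilde{p}(0) = 0,
% with \tilde{p} locally Lipschitz around the origin.
% Since the nominal system \dot{z} = f(z) is asymptotically stable
% at z = 0, asymptotic stability of the origin is preserved for
% sufficiently small values of |\sigma|.
```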
Paolillo, A., Faragasso, A., Oriolo, G. et al. Vision-based maze navigation for humanoid robots. Autonomous Robots 41, 293–309 (2017). doi:10.1007/s10514-015-9533-1
Keywords
Vision-based navigation
 Humanoid robots
 Visual control