
Vision-based maze navigation for humanoid robots


Abstract

We present a vision-based approach for the navigation of humanoid robots in networks of corridors connected through curves and junctions. The objective of the humanoid is to follow the corridors, walking as close as possible to their center to maximize motion safety, and to turn at curves and junctions. Our control algorithm is inspired by a technique originally designed for unicycle robots, which we have adapted to humanoid navigation and extended to cope with the presence of turns and junctions. In addition, we prove here that the corridor-following control law provides asymptotic convergence of the robot heading and position to the corridor bisector even when the corridor walls are not parallel. A state transition system is designed to allow navigation in mazes of corridors, curves and T-junctions. Extensive experiments demonstrate the validity and robustness of the approach.
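As a rough illustration of the switching behavior described in the abstract, the sketch below shows a minimal, hypothetical supervisor alternating between corridor following and turning. The state names and the perception predicates junction_detected and turn_completed are illustrative assumptions and do not reproduce the paper's actual state transition system.

```python
from enum import Enum, auto

class NavState(Enum):
    FOLLOW_CORRIDOR = auto()   # walk along the corridor bisector
    TURN = auto()              # execute a turn at a curve or T-junction

class MazeSupervisor:
    """Hypothetical two-state supervisor in the spirit of the approach described above."""

    def __init__(self) -> None:
        self.state = NavState.FOLLOW_CORRIDOR

    def step(self, junction_detected: bool, turn_completed: bool) -> NavState:
        # The predicates stand in for the paper's vision-based detection of turns and junctions.
        if self.state is NavState.FOLLOW_CORRIDOR and junction_detected:
            self.state = NavState.TURN
        elif self.state is NavState.TURN and turn_completed:
            self.state = NavState.FOLLOW_CORRIDOR
        return self.state
```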


Notes

  1. We consider only the case of corridor guidelines that converge with respect to the robot direction of motion. Following corridors with divergent guidelines is limited by the size of the camera field of view. The approach proposed in this paper remains valid in that case, but the technical details of its application are not discussed here for lack of space.


Author information


Corresponding author

Correspondence to Marilena Vendittelli.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 20750 KB)

Appendix

In this section we analyze the perturbation terms in Eq. (11) to prove that they are vanishing and locally Lipschitz around the equilibrium point \((x_m, x_v)=(0, 0)\).
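For context, these two properties are what a standard perturbation argument requires. Stated informally (a generic sketch, not the paper's exact formulation): if the nominal closed-loop system \(\dot{x} = f(x)\) has an exponentially stable equilibrium at the origin and the perturbation \(p\) satisfies

$$\begin{aligned} p(0) = 0, \qquad \Vert p(x)\Vert \le \sigma L \Vert x \Vert \quad \text{near } x = 0, \end{aligned}$$

with \(L\) a local Lipschitz constant, then the origin remains locally exponentially stable for the perturbed dynamics \(\dot{x} = f(x) + p(x)\), provided \(\sigma \) is small enough.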

Computing the time derivative of the visual feature expressions (6) and (8) in the case of non-parallel corridor walls, and using the unicycle model (1) together with the control designed on the nominal system (9), we obtain for the middle point \(x_m\) the closed-loop dynamics

$$\begin{aligned} \dot{x}_m = -k_p x_m + p_m = -k_p x_m + A_m + B_m + C_m + D_m, \end{aligned}$$

where the perturbation \(p_m\) is composed of the terms

$$\begin{aligned} A_m &= \sigma \left( x_m \omega + k_2 v \right) \frac{1}{d k_1}\, \frac{x_v \left( h \tan \gamma \sec \theta - (x_v/k_1)\, x \right) - k_1 x}{1 - (\sigma /d) \left( y - (x_v/k_1)\, x \right) },\\ B_m &= E_m \tan ^2\theta \left( k_3 \omega + k_2 v \tan \theta \right),\\ C_m &= E_m \tan \theta \left( 2 x_m \sec ^2 \theta \, \omega + k_2 v \right),\\ D_m &= E_m \theta \left( k_3 + k_2 \sec ^3\theta \, (y - d/\sigma )\right) \omega , \end{aligned}$$
(12)

with

$$\begin{aligned} E_m = \frac{\sigma ^2}{1-\sigma ^2 \tan ^2 \theta }. \end{aligned}$$

To show that the perturbation is vanishing we should invert the map \((x,\theta )\rightarrow (x_m, x_v)\) so as to express the perturbation term as a function of \((x_m, x_v)\) only. However, Proposition 1 shows that at the equilibrium \((x_m, x_v)^T=(0, 0)^T \Rightarrow (x, \theta )^T=(0, 0)^T\), except on the points of the circle centered at \(P_V\), the intersection point of the two corridor guidelines, with radius \(r_V=\frac{h}{\tan \gamma }\).

These points, however, cannot be stable equilibria of the closed loop dynamics, as also shown in Proposition 1. In addition, considering the robot footprint with respect to the corridor width, these points can hardly be reached in practical situations. Hence, by ignoring these points, we can evaluate the perturbation terms at the equilibrium of the nominal system by setting \((x_m, x_v)=(0, 0)\) and \((x, \theta )=(0,0)\).

A quick inspection of Eq. (12) shows that \(p_m\) vanishes at the equilibrium point \((x_m, x_v) = (0, 0)\), implying that the perturbation induced by the non-parallel corridor guidelines on the nominal closed-loop dynamics is non-persistent.
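Explicitly, evaluating the terms in Eq. (12) at \((x_m, x_v)=(0,0)\) and \((x, \theta )=(0,0)\) (a step-by-step check of the claim, away from the singular configurations discussed below):

$$\begin{aligned} A_m\big|_0 &= \sigma \, k_2 v \, \frac{1}{d k_1}\, \frac{0 \cdot \left( h \tan \gamma - 0 \right) - k_1 \cdot 0}{1 - (\sigma /d)\, y} = 0,\\ B_m\big|_0 &= E_m \tan ^2 0 \left( k_3 \omega + k_2 v \tan 0 \right) = 0,\\ C_m\big|_0 &= E_m \tan 0 \left( 0 + k_2 v \right) = 0,\\ D_m\big|_0 &= E_m \cdot 0 \cdot \left( k_3 + k_2 \, (y - d/\sigma )\right) \omega = 0, \end{aligned}$$

so that \(p_m = A_m + B_m + C_m + D_m\) vanishes at the equilibrium for any admissible \(y\), \(v\) and \(\omega \).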

The perturbation term is composed of sums and products of functions that are locally Lipschitz around the equilibrium of the nominal system, with the exception of the terms \(A_m\) and \(E_m\), which present, respectively, the following singularities.

  • \(y-y_V = \frac{x_v}{k_1}x\): this singularity cannot occur around the equilibrium point, since it would imply that the robot is very close to the point \(P_V\) where the corridor guidelines intersect, a physically impossible situation;

  • \(\tan \theta = \pm 1/\sigma \): this condition holds when the robot heading is perpendicular to the corridor walls, a situation that cannot occur around the origin of the system if \(\sigma \ne 0\).

Finally, given the terms in (12), \(p_m\) can be easily expressed as \( p_m = \sigma \tilde{p}_m. \)
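Indeed, each term in Eq. (12) carries at least one factor of \(\sigma \): directly in \(A_m\), and through \(E_m = \sigma ^2/(1-\sigma ^2 \tan ^2\theta )\) in \(B_m\), \(C_m\) and \(D_m\). As a purely numerical illustration (not taken from the paper), the sketch below evaluates the terms of Eq. (12) with hypothetical parameter values, treating the state variables and the inputs \(v\), \(\omega \) as independent quantities; it shows that \(p_m\) is zero at the equilibrium and scales roughly linearly with \(\sigma \) nearby.

```python
import math

# Hypothetical parameter values, chosen only for illustration
k1, k2, k3 = 1.0, 1.0, 1.0
d, h, gamma = 3.0, 1.2, math.radians(30.0)
v, omega = 0.1, 0.05

def p_m(sigma, x_m, x_v, x, y, theta):
    """Sum of the perturbation terms A_m + B_m + C_m + D_m of Eq. (12)."""
    sec = 1.0 / math.cos(theta)
    tan = math.tan(theta)
    E_m = sigma**2 / (1.0 - sigma**2 * tan**2)
    A_m = (sigma * (x_m * omega + k2 * v) / (d * k1)
           * (x_v * (h * math.tan(gamma) * sec - (x_v / k1) * x) - k1 * x)
           / (1.0 - (sigma / d) * (y - (x_v / k1) * x)))
    B_m = E_m * tan**2 * (k3 * omega + k2 * v * tan)
    C_m = E_m * tan * (2.0 * x_m * sec**2 * omega + k2 * v)
    D_m = E_m * theta * (k3 + k2 * sec**3 * (y - d / sigma)) * omega
    return A_m + B_m + C_m + D_m

# Zero at the equilibrium of the nominal system, whatever y, v, omega
print(p_m(0.05, 0.0, 0.0, 0.0, 1.0, 0.0))        # 0.0
# Roughly linear scaling with the wall slope sigma near the equilibrium
print(p_m(0.05, 0.01, 0.01, 0.01, 1.0, 0.01))    # small value
print(p_m(0.025, 0.01, 0.01, 0.01, 1.0, 0.01))   # roughly half as large
```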

To prove that the perturbation of the vanishing point closed-loop dynamics is non-persistent and locally Lipschitz at the equilibrium of the nominal system, we proceed analogously to the case of the middle point. The closed-loop dynamics of \(x_v\) in the case of non-parallel guidelines is

$$\begin{aligned} \dot{x}_v = f_v(x_m,x_v) + D_v + E_v = f_v (x_m,x_v) + p_v, \end{aligned}$$

where \(f_v (x_m,x_v)\) represents the nominal closed-loop dynamics in Eq. (10), while the perturbation \(p_v\) is composed of the following terms

$$\begin{aligned} E_v &= \sigma ^2 \frac{1}{k_1 d^2} \left( \frac{ x_v \left( h \tan \gamma \sec \theta - (x_v / k_1)\, x \right) - k_1 x}{1 - (\sigma /d) \left( y - (x_v/k_1)\, x \right) } \right) ^2 \omega ,\\ D_v &= \frac{(\dot{B}_v - \dot{A}_v C_v)(1+C_v) - \dot{C}_v (A_v + B_v)}{(1+C_v)^2}, \end{aligned}$$

where

$$\begin{aligned} A_v &= k_1 \tan \theta ,\\ \dot{A}_v &= k_1 \left( \tan ^2\theta + 1 \right) \omega ,\\ B_v &= \sigma k_1 \frac{1}{d} \left( x - y \tan \theta \right),\\ \dot{B}_v &= -\frac{\sigma }{d}\, y\, \dot{A}_v,\\ C_v &= \sigma \frac{1}{d}\left( h \tan \gamma \sec \theta - x \tan \theta - y \right),\\ \dot{C}_v &= \sigma \frac{1}{d} \sec \theta \left( \left( h \tan \gamma \sin \theta - x \right) \sec \theta \, \omega - v \right). \end{aligned}$$

As in the previous case, the perturbation can be shown to vanish by setting \((x_m, x_v)=(0, 0)\) and \((x, \theta )=(0, 0)\). Also in this case, the perturbation term has some singularities. In particular, the term \(E_v\) presents the same singularity as \(A_m\) in Eq. (12), discussed above. The term \(1+C_v\) in the denominator of \(D_v\) vanishes if the following equation holds

$$\begin{aligned} (y_V - y)\cos \theta - x \sin \theta = - h \tan \gamma . \end{aligned}$$
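For completeness, this condition follows from the expression of \(C_v\) above, multiplying by \(\cos \theta \) and using \(y_V = d/\sigma \), an assumption consistent with the \(A_m\) singularity condition \(y - y_V = (x_v/k_1)\,x\) discussed earlier:

$$\begin{aligned} 1 + C_v = 0 \;\Leftrightarrow \; h \tan \gamma \sec \theta - x \tan \theta - y = -\frac{d}{\sigma } \;\Leftrightarrow \; (y_V - y) \cos \theta - x \sin \theta = - h \tan \gamma . \end{aligned}$$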

The simple geometric construction in Fig. 18 shows that the term on the left represents the signed distance of the robot to the line orthogonal to the optical axis (aligned with the robot heading) and passing through the point \(P_V\) where the corridor guidelines intersect. This distance becomes negative, and eventually equal to \(-h\tan \gamma \), if the robot crosses this line. Around the equilibrium point this would mean that the robot crosses this line close to \(P_V\), again a non-operative condition.

Fig. 18: Geometric interpretation of the \(D_v\) singularity

We can then state that around the equilibrium the perturbation term \(p_v\) is locally Lipschitz, being given by sums and products of locally Lipschitz functions. Analogously to the perturbation of the middle point dynamics, \(p_v\) can be written as \(p_v = \sigma \tilde{p}_v\).

Summing up, the perturbation terms generated by the non-parallel corridor walls have been shown to be vanishing and locally Lipschitz around the equilibrium of the nominal dynamics. Furthermore, both perturbation terms are proportional to the parameter \(\sigma \), which represents the relative slope of the corridor walls.
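In compact form, the results of this appendix can therefore be summarized as

$$\begin{aligned} \dot{x}_m = -k_p x_m + \sigma \tilde{p}_m, \qquad \dot{x}_v = f_v(x_m, x_v) + \sigma \tilde{p}_v, \end{aligned}$$

with \(\tilde{p}_m\) and \(\tilde{p}_v\) vanishing at the equilibrium of the nominal dynamics and locally Lipschitz around it.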

About this article


Cite this article

Paolillo, A., Faragasso, A., Oriolo, G. et al. Vision-based maze navigation for humanoid robots. Auton Robot 41, 293–309 (2017). https://doi.org/10.1007/s10514-015-9533-1


Keywords

Navigation