
Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors


The combination of visual and inertial sensors for state estimation has recently attracted wide interest in the robotics community, especially in the aerial robotics field, due to the light weight and complementary characteristics of the sensor data. However, most state estimation systems based on visual-inertial sensing suffer from demanding processing requirements, which in many cases make them impractical. In this paper, we propose a simple, low-cost and high-rate method for state estimation that enables autonomous flight of micro aerial vehicles at a low computational burden. The proposed state estimator fuses observations from an inertial measurement unit, an optical-flow smart camera and a time-of-flight range sensor. The smart camera provides optical-flow measurements at rates of up to 200 Hz, relieving the main processor of the computational bottleneck that image processing would otherwise impose. To the best of our knowledge, this is the first example of extending the use of these smart cameras from hovering-like motions to odometry estimation, producing estimates that remain usable over flight times of several minutes. To validate and defend the simplest algorithmic solution, we investigate the performance of two Kalman filters, in the extended and error-state flavors, alongside a large number of algorithm modifications defended in earlier literature on visual-inertial odometry, and show that their impact on filter performance is minimal. To close the control loop, a nonlinear controller operating on the special Euclidean group SE(3) drives a quadrotor platform in 3D space, based on the estimated vehicle state, guaranteeing the asymptotic stability of 3D position and heading. All estimation and control tasks are solved on board and in real time on a limited computational unit.
The proposed approach is validated through simulations and experimental results, which include comparisons with ground-truth data provided by a motion capture system. For the benefit of the community, we make the source code public.




  1. Note that in the EKF the orientation error is additive, so this distinction is irrelevant.






  • Bar-Shalom, Y., Li, X. R., & Kirubarajan, T. (2004). Estimation with applications to tracking and navigation: Theory algorithms and software. Hoboken: Wiley.


  • Blösch, M., Omari, S., Fankhauser, P., Sommer, H., Gehring, C., Hwangbo, J., Hoepflinger, M., Hutter, M., & Siegwart, R. (2014). Fusion of optical flow and inertial measurements for robust egomotion estimation. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 3102–3107). Chicago.

  • Blösch, M., Weiss, S., Scaramuzza, D., & Siegwart, R. (2010). Vision based MAV navigation in unknown and unstructured environments. In Proceedings of the IEEE international conference on robotics and automation (pp. 21–28). Anchorage.

  • Bullo, F., & Lewis, A. (2004). Geometric control of mechanical systems. Berlin: Springer.


  • Faessler, M., Fontana, F., Forster, C., Mueggler, E., Pizzoli, M., & Scaramuzza, D. (2016). Autonomous, vision-based flight and live dense 3D mapping with a quadrotor micro aerial vehicle. Journal of Field Robotics, 33(4), 431–450.


  • Fliess, M., Lévine, J., Martin, P., & Rouchon, P. (1995). Flatness and defect of non-linear systems: Introductory theory and examples. International Journal of Control, 61(6), 1327–1361.


  • Forster, C., Pizzoli, M., & Scaramuzza, D. (2014). SVO: Fast semi-direct monocular visual odometry. In Proceedings of the IEEE international conference on robotics and automation (pp. 15–22). Hong Kong.

  • Forte, F., Naldi, R., & Marconi, L. (2012). Impedance control of an aerial manipulator. In Proceedings of the American control conference (pp. 3839–3844). Montreal.

  • Fraundorfer, F., Heng, L., Honegger, D., Lee, G. H., Meier, L., Tanskanen, P., & Pollefeys, M. (2012). Vision-based autonomous mapping and exploration using a quadrotor MAV. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 4557–4564). Vilamoura.

  • Hérissé, B., Hamel, T., Mahony, R., & Russotto, F. X. (2012). Landing a VTOL unmanned aerial vehicle on a moving platform using optical flow. IEEE Transactions on Robotics, 28(1), 77–89.


  • Hesch, J. A., Kottas, D. G., Bowman, S. L., & Roumeliotis, S. I. (2014). Camera-IMU-based localization: Observability analysis and consistency improvement. The International Journal of Robotics Research, 33(1), 182–201.


  • Honegger, D., Lorenz, M., Tanskanen, P., & Pollefeys, M. (2013). An open source and open hardware embedded metric optical flow CMOS camera for indoor and outdoor applications. In Proceedings of the IEEE international conference on robotics and automation (pp. 1736–1741). Karlsruhe.

  • Jones, E. S., & Soatto, S. (2011). Visual-inertial navigation, mapping and localization: A scalable real-time causal approach. The International Journal of Robotics Research, 30(4), 407–430.


  • Kelly, J., & Sukhatme, G. S. (2011). Visual-inertial sensor fusion: Localization, mapping and sensor-to-sensor self-calibration. The International Journal of Robotics Research, 30(1), 56–79.


  • Lee, T., Leok, M., & McClamroch, N. H. (2013). Nonlinear robust tracking control of a quadrotor UAV on SE(3). Asian Journal of Control, 15(2), 391–408.


  • Li, M., & Mourikis, A. I. (2012). Improving the accuracy of EKF-based visual-inertial odometry. In Proceedings of the IEEE international conference on robotics and automation (pp. 828–835). Saint Paul.

  • Li, M., & Mourikis, A. I. (2013). High-precision, consistent EKF-based visual-inertial odometry. The International Journal of Robotics Research, 32(6), 690–711.


  • Liu, H., Darabi, H., Banerjee, P., & Liu, J. (2007). Survey of wireless indoor positioning techniques and systems. IEEE Transactions on Systems, Man, and Cybernetics, 37(6), 1067–1080.


  • Loianno, G., Mulgaonkar, Y., Brunner, C., Ahuja, D., Ramanandan, A., Chari, M., et al. (2015a). Smartphones power flying robots. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 1256–1263). Hamburg.

  • Loianno, G., Thomas, J., & Kumar, V. (2015b). Cooperative localization and mapping of MAVs using RGB-D sensors. In Proceedings of the IEEE international conference on robotics and automation (pp. 4021–4028). Seattle.

  • Madyastha, V. K., Ravindra, V. C., Mallikarjunan, S., & Goyal, A. (2011). Extended Kalman filter vs. error state Kalman filter for aircraft attitude estimation. In Proceedings of the AIAA guidance, navigation, and control conference (pp. 6615–6638). Portland.

  • Martinelli, A. (2012). Vision and IMU data fusion: Closed-form solutions for attitude, speed, absolute scale, and bias determination. IEEE Transactions on Robotics, 28(1), 44–60.


  • Martinelli, A. (2013). Visual-inertial structure from motion: Observability and resolvability. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 4235–4242). Tokyo.

  • Meier, L., Tanskanen, P., Heng, L., Lee, G., Fraundorfer, F., & Pollefeys, M. (2012). PIXHAWK: A micro aerial vehicle design for autonomous flight using onboard computer vision. Autonomous Robots, 33(1–2), 21–39.


  • Mellinger, D., & Kumar, V. (2011). Minimum snap trajectory generation and control for quadrotors. In Proceedings of the IEEE international conference on robotics and automation (pp. 2520–2525). Shanghai.

  • Michael, N., Mellinger, D., Lindsey, Q., & Kumar, V. (2010). The GRASP multiple micro-UAV testbed. IEEE Robotics & Automation Magazine, 17(3), 56–65.


  • Michael, N., Shen, S., Mohta, K., Kumar, V., Nagatani, K., Okada, Y., et al. (2012). Collaborative mapping of an earthquake-damaged building via ground and aerial robots. Journal of Field Robotics, 29(5), 832–841.


  • Nikolic, J., Rehder, J., Burri, M., Gohl, P., Leutenegger, S., Furgale, P. T., & Siegwart, R. (2014). A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM. In Proceedings of the IEEE international conference on robotics and automation (pp. 431–437). Hong Kong.

  • Omari, S., & Ducard, G. (2013). Metric visual-inertial navigation system using single optical flow feature. In Proceedings of the European control conference (pp. 1310–1316). Zurich.

  • Ozaslan, T., Shen, S., Mulgaonkar, Y., Michael, N., & Kumar, V. (2013). Inspection of penstocks and featureless tunnel-like environments using micro UAVs. In Proceedings of the conference on field and service robotics (pp. 123–136). Brisbane.

  • Ravindra, V., Madyastha, V., & Goyal, A. (2012). The equivalence between two well-known variants of the Kalman filter. In Proceedings of the conference on advances in control and optimization of dynamical systems. Bangalore.

  • Rossi, R., Santamaria-Navarro, A., Andrade-Cetto, J., & Rocco, P. (2017). Trajectory generation for unmanned aerial manipulators through quadratic programming. IEEE Robotics and Automation Letters, 2(2), 389–396.

  • Roumeliotis, S. I., Johnson, A. E., & Montgomery, J. F. (2002). Augmenting inertial navigation with image-based motion estimation. In Proceedings of the IEEE international conference on robotics and automation (Vol. 4, pp. 4326–4333). Washington.

  • Roussillon, C., Gonzalez, A., Solà, J., Codol, J. M., Mansard, N., Lacroix, S., & Devy, M. (2011). RT-SLAM: A generic and real-time visual SLAM implementation. In J. L. Crowley, B. A. Draper, & M. Thonnat (Eds.), Computer vision systems, lecture notes in computer science (Vol. 6962, pp. 31–40). Berlin, Heidelberg: Springer.

  • Ruffo, M., Di Castro, M., Molinari, L., Losito, R., Masi, A., Kovermann, J., & Rodrigues, L. (2014). New infrared time-of-flight measurement sensor for robotic platforms. In Proceedings of the international symposium work on ADC modelling and testing (pp. 13–18).

  • Santamaria-Navarro, A., Grosch, P., Lippiello, V., Solà, J., & Andrade-Cetto, J. (2017). Uncalibrated visual servo for unmanned aerial manipulation. IEEE/ASME Transactions on Mechatronics, 22(4), 1610–1621.

  • Santamaria-Navarro, A., Lippiello, V., & Andrade-Cetto, J. (2014). Task priority control for aerial manipulation. In Proceedings of the IEEE international symposium on safety security and rescue robotics (pp. 1–6). Toyako-cho.

  • Santamaria-Navarro, A., Solà, J., & Andrade-Cetto, J. (2015). High-frequency MAV state estimation using low-cost inertial and optical flow measurement units. In Proceedings of the IEEE/RSJ international conference on intelligent robots and system (pp. 1864–1871). Hamburg.

  • Shen, S., Michael, N., & Kumar, V. (2012). Autonomous indoor 3D exploration with a micro-aerial vehicle. In Proceedings of the IEEE international conference on robotics and automation (pp. 9–15). Saint Paul.

  • Shen, S., Mulgaonkar, Y., Michael, N., & Kumar, V. (2013). Vision-based state estimation and trajectory control towards high-speed flight with a quadrotor. In Proceedings of robotics: science and systems. Berlin.

  • Solà, J. (2015). Quaternion kinematics for the error-state KF. hal-01122406, v5 (in preparation).

  • Solà, J., Vidal-Calleja, T., Civera, J., & Montiel, J. M. M. (2011). Impact of landmark parametrization on monocular EKF-SLAM with points and lines. International Journal of Computer Vision, 97(3), 339–368.


  • Thomas, J., Loianno, G., Sreenath, K., & Kumar, V. (2014). Toward image based visual servoing for aerial grasping and perching. In Proceedings of the IEEE international conference on robotics and automation (pp. 2113–2118). Hong Kong.

  • Tomic, T., Schmid, K., Lutz, P., Domel, A., Kassecker, M., Mair, E., et al. (2012). Toward a fully autonomous UAV: Research platform for indoor and outdoor urban search and rescue. IEEE Robotics & Automation Magazine, 19(3), 46–56.


  • Trawny, N., & Roumeliotis, S. I. (2005). Indirect Kalman filter for 3D attitude estimation. University of Minnesota, Department of Computer Science & Engineering, Tech. Rep. 2, rev. 57.

  • Weiss, S., Achtelik, M., Lynen, S., Chli, M., & Siegwart, R. (2012). Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. In Proceedings of the IEEE international conference on robotics and automation (pp. 957–964). Saint Paul.

  • Weiss, S., Scaramuzza, D., & Siegwart, R. (2011). Monocular-SLAM-based navigation for autonomous micro helicopters in GPS-denied environments. Journal of Field Robotics, 28(6), 854–874.



Author information



Corresponding author

Correspondence to Angel Santamaria-Navarro.

Additional information

This work was funded by the project ROBINSTRUCT (TIN2014-58178-R) and the Ramón y Cajal postdoctoral fellowship (RYC-2012-11604) from the Spanish Ministry of Economy and Competitiveness, by the Spanish State Research Agency through the María de Maeztu Seal of Excellence to IRI (MDM-2016-0656), and by the EU H2020 project AEROARMS (H2020-ICT-2014-1-644271).

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 9320 KB)


Appendix A: Quaternion conventions and properties

We use, as in Solà (2015), the Hamilton convention for quaternions. If we denote a quaternion \({}^G\mathbf{q}_L\) representing the orientation of a local frame L with respect to a global frame G, then a generic composition of two quaternions is defined as

$$\begin{aligned} {}^G\mathbf{q}_{C}\,=\,{}^G\mathbf{q}_{L}\otimes {}^L\mathbf{q}_{C} ={}^G\mathbf{Q}^{+}_{L}\,{}^L\mathbf{q}_{C}\,=\,{}^L\mathbf{Q}^{-}_{C}\,{}^G\mathbf{q}_{L}, \end{aligned}$$

where, for a quaternion \(\mathbf{q}=[w,x,y,z]^\top \), we can define \(\mathbf{Q}^{+}\) and \(\mathbf{Q}^{-}\) respectively as the left- and right-quaternion product matrices,

$$\begin{aligned} \mathbf{Q}^{+} = \begin{bmatrix} w&-x&-y&-z \\ x&w&-z&y \\ y&z&w&-x \\ z&-y&x&w \end{bmatrix}~,~ \mathbf{Q}^{-} = \begin{bmatrix} w&-x&-y&-z \\ x&w&z&-y \\ y&-z&w&x \\ z&y&-x&w \end{bmatrix}\,. \end{aligned}$$

In the quaternion product, we notice how the right-hand quaternion is defined locally in the frame L, which is specified by the left-hand quaternion. Vector transformation from a local frame L to the global G is performed by the double product

$$\begin{aligned} {}^G\mathbf{v} = {}^G\mathbf{q}_{L}\otimes {}^L\mathbf{v}\otimes ({}^G\mathbf{q}_{L})^* = {}^G\mathbf{q}_{L}\otimes {}^L\mathbf{v}\otimes {}^L\mathbf{q}_{G}, \end{aligned}$$

where we use the shortcut \(\mathbf{q}\otimes \mathbf{v} \equiv \mathbf{q}\otimes [0,\mathbf{v}]^\top \) for convenience of notation.
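These identities are straightforward to check numerically. The NumPy sketch below is illustrative only (it is not the authors' implementation, and all function names are ours); it implements the Hamilton product, the two product matrices, and the vector transformation:

```python
import numpy as np

def qprod(p, q):
    """Hamilton product p ⊗ q, with quaternions stored as [w, x, y, z]."""
    pw, pv = p[0], p[1:]
    qw, qv = q[0], q[1:]
    return np.concatenate(([pw * qw - pv @ qv],
                           pw * qv + qw * pv + np.cross(pv, qv)))

def Qplus(q):
    """Left-product matrix: p ⊗ q = Qplus(p) @ q."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def Qminus(q):
    """Right-product matrix: p ⊗ q = Qminus(q) @ p."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def rotate(q, v):
    """Transform a local vector to the global frame: q ⊗ [0, v] ⊗ q*."""
    q_conj = np.array([q[0], -q[1], -q[2], -q[3]])
    return qprod(qprod(q, np.concatenate(([0.0], v))), q_conj)[1:]
```

For a unit quaternion, `rotate(q, v)` agrees with the rotation-matrix form \(\mathbf{R}\{\mathbf{q}\}\,\mathbf{v}\) given further below.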

Throughout the paper, we denote by \(\mathbf{q}\{x\}\) the quaternion and by \(\mathbf{R}\{x\}\) the rotation matrix equivalent to a generic orientation x. A rotation \(\varvec{\theta }= \theta \mathbf u\), of \(\theta \) radians about the unit axis \(\mathbf u\), can be expressed in quaternion and matrix form using the exponential maps

$$\begin{aligned} \mathbf{q}\{{\varvec{\theta }}\} =&\, e^{{\varvec{\theta }}/2} = \begin{bmatrix} \cos (\theta /2) \\ \mathbf{u}\sin (\theta /2) \end{bmatrix} \xrightarrow [\theta \rightarrow 0]{} \begin{bmatrix} 1 \\ {\varvec{\theta }}/2 \end{bmatrix} \,, \end{aligned}$$
$$\begin{aligned} \mathbf{R}\{{\varvec{\theta }}\} =&\, e^{[{\varvec{\theta }}]_\times } = \mathbf{I} \!+\! \sin \theta [\mathbf{u}]_\times \!+\! (1 \!-\! \cos \theta )[\mathbf{u}]_\times ^2\nonumber \\&\xrightarrow [\theta \rightarrow 0]{} \mathbf{I} \!+\! [{\varvec{\theta }}]_\times \end{aligned}$$

We also write \(\mathbf{R}\,=\,\mathbf{R}\{\mathbf{q}\}\), according to

$$\begin{aligned} {\mathbf{R}\{\mathbf{q}\} = \begin{bmatrix} w^2\!+\!x^2\!-\!y^2\!-\!z^2&2(xy-wz)&2(xz + wy) \\ 2(xy + wz)&w^2\!-\!x^2\!+\!y^2\!-\!z^2&2(yz - wx) \\ 2(xz - wy)&2(yz + wx)&w^2\!-\!x^2\!-\!y^2\!+\!z^2 \end{bmatrix}} \end{aligned}$$
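As a quick consistency check, the two exponential maps and \(\mathbf{R}\{\mathbf{q}\}\) must satisfy \(\mathbf{R}\{{\varvec{\theta }}\} = \mathbf{R}\{\mathbf{q}\{{\varvec{\theta }}\}\}\). The NumPy sketch below is illustrative (the names are ours, not from the paper):

```python
import numpy as np

def skew(u):
    """Cross-product (skew-symmetric) matrix [u]x."""
    return np.array([[0.0, -u[2], u[1]],
                     [u[2], 0.0, -u[0]],
                     [-u[1], u[0], 0.0]])

def q_exp(theta):
    """Quaternion exponential of a rotation vector θ = θ·u."""
    th = np.linalg.norm(theta)
    if th < 1e-12:                       # small-angle limit [1, θ/2]
        return np.concatenate(([1.0], theta / 2))
    u = theta / th
    return np.concatenate(([np.cos(th / 2)], u * np.sin(th / 2)))

def R_exp(theta):
    """Rodrigues formula R = I + sinθ [u]x + (1 − cosθ)[u]x²."""
    th = np.linalg.norm(theta)
    if th < 1e-12:                       # small-angle limit I + [θ]x
        return np.eye(3) + skew(theta)
    K = skew(theta / th)
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * K @ K

def R_of_q(q):
    """Rotation matrix R{q} of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [w*w + x*x - y*y - z*z, 2*(x*y - w*z),         2*(x*z + w*y)],
        [2*(x*y + w*z),         w*w - x*x + y*y - z*z, 2*(y*z - w*x)],
        [2*(x*z - w*y),         2*(y*z + w*x),         w*w - x*x - y*y + z*z]])
```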

Finally, the time-derivative of the quaternion is

$$\begin{aligned} \dot{\mathbf{q}} = \frac{1}{2}{\varvec{\Omega }}({\varvec{\omega }})\mathbf{q} = \frac{1}{2}{} \mathbf{q}\otimes {\varvec{\omega }}\,, \end{aligned}$$

with \({\varvec{\omega }}\) the angular velocity in body frame, and \({\varvec{\Omega }}\) the skew-symmetric matrix defined as

$$\begin{aligned} {\varvec{\Omega }}({\varvec{\omega }}) \triangleq \mathbf{Q}^-({\varvec{\omega }}) = \begin{bmatrix} 0&-{\varvec{\omega }}^\top \\ {\varvec{\omega }}&-[{\varvec{\omega }}]_\times \end{bmatrix}. \end{aligned}$$
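This differential equation can be propagated with a simple forward-Euler step followed by renormalization. The sketch below is illustrative only, assumes a constant body rate, and is not the discrete-time integration used by the paper's filters:

```python
import numpy as np

def Omega(w):
    """Ω(ω) = Q⁻(ω) = [[0, −ωᵀ], [ω, −[ω]x]]."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def integrate_q(q, w, t, steps):
    """Integrate q̇ = ½ Ω(ω) q over time t with constant body rate ω."""
    dt = t / steps
    for _ in range(steps):
        q = q + 0.5 * Omega(w) @ q * dt
        q = q / np.linalg.norm(q)        # project back onto the unit sphere
    return q
```

Spinning the identity quaternion about the body z-axis at \(\pi\) rad/s for one second yields, up to integration error, the quaternion for a half-turn about z.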

Appendix B: Filter transition matrices

We detail the construction of the filter transition matrix for the three integrations involved: the ESKF nominal (22), the ESKF error (23), and the EKF true (25) kinematics. For each case, we define the matrix \(\mathbf{A}\) as the Jacobian of the respective continuous-time system, and build the transition matrix \(\mathbf{F}_N\) as the truncated Taylor series (20), i.e.,

$$\begin{aligned} \mathbf{F}_N = \sum _{n=0}^N \frac{1}{n!}{} \mathbf{A}^n\Delta t^n = \mathbf{I}+\mathbf{A}\Delta t +\frac{1}{2\,!}{} \mathbf{A}^2\Delta t^2 +\cdots \end{aligned}$$

In the following paragraphs, we detail the matrices \(\mathbf{A}\) for each case, with examples of their first powers up to \(n=3\). The reader should have no difficulty in building the powers of \(\mathbf{A}\) that are not detailed, and the transition matrices \(\mathbf{F}_N\), using the Taylor series above.
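Building \(\mathbf{F}_N\) from a given \(\mathbf{A}\) takes only a few lines. The sketch below (illustrative NumPy, names ours) implements the truncated series:

```python
import numpy as np
from math import factorial

def transition_matrix(A, dt, N):
    """F_N = Σ_{n=0}^{N} Aⁿ Δtⁿ / n!, the truncated Taylor series of exp(AΔt)."""
    F = np.zeros_like(A, dtype=float)
    An = np.eye(A.shape[0])              # A⁰ = I
    for n in range(N + 1):
        F += An * dt**n / factorial(n)
        An = An @ A                      # next power of A
    return F
```

When \(\mathbf{A}\) is nilpotent, as for the GE error-state Jacobian detailed next, where \(\mathbf{A}^n=\mathbf{0}\) for \(n>3\), the truncation at \(N=3\) is exact: \(\mathbf{F}_3 = e^{\mathbf{A}\Delta t}\).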

The Jacobian \(\mathbf{A}=\partial f(\mathbf{x},\delta \mathbf{x},\cdot )/\partial \delta \mathbf{x}\) of the ESKF’s continuous time error-state system f() (18) using GE is,

$$\begin{aligned} \mathbf{A} = \begin{bmatrix} 0&\mathbf{I}&0&0&0 \\ 0&0&\mathbf{V}&-\mathbf{R}&0 \\ 0&0&0&0&-\mathbf{R} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix}\,, \end{aligned}$$

with \(\mathbf{V} = -[\mathbf{R}(\mathbf{a}_S-\mathbf{a}_b)]_\times \). Its powers are,

$$\begin{aligned} \mathbf{A}^2 = \begin{bmatrix} 0&0&\mathbf{V}&-\mathbf{R}&0 \\ 0&0&0&0&-\mathbf{V}{} \mathbf{R} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix} ~,~ \mathbf{A}^3 = \begin{bmatrix} 0&0&0&0&-\mathbf{V}{} \mathbf{R} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix}\,, \end{aligned}$$

and \(\mathbf{A}^{n} = \mathbf{0}\) for \(n>3\). For LE we have

$$\begin{aligned} { \mathbf{A} = \begin{bmatrix} 0&\mathbf{I}&0&0&0 \\ 0&0&\mathbf{V}&-\mathbf{R}&0 \\ 0&0&{\varvec{\Theta }}&0&-\mathbf{I} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix}, \mathbf{A}^2 = \begin{bmatrix} 0&0&\mathbf{V}&-\mathbf{R}&0 \\ 0&0&\mathbf{V}{\varvec{\Theta }}&0&-\mathbf{V} \\ 0&0&{\varvec{\Theta }}^2&0&-{\varvec{\Theta }} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix},\cdots } \end{aligned}$$

with \(\mathbf{V} = -\mathbf{R}[\mathbf{a}_S-\mathbf{a}_b]_\times \), and \({\varvec{\Theta }} = -[{\varvec{\omega }}_S-{\varvec{\omega }}_b]_\times \).
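The nilpotency claim \(\mathbf{A}^{n} = \mathbf{0}\) for \(n>3\) in the GE case is easy to verify numerically. The sketch below is illustrative; the 15-state block ordering \((\delta \mathbf{p}, \delta \mathbf{v}, \delta {\varvec{\theta }}, \delta \mathbf{a}_b, \delta {\varvec{\omega }}_b)\) is our reading of the matrices above:

```python
import numpy as np

def skew(u):
    """Cross-product (skew-symmetric) matrix [u]x."""
    return np.array([[0.0, -u[2], u[1]],
                     [u[2], 0.0, -u[0]],
                     [-u[1], u[0], 0.0]])

def A_ge(R, a):
    """GE error-state Jacobian, with V = -[R a]x and a = a_S − a_b.
    Blocks ordered (δp, δv, δθ, δa_b, δω_b)."""
    A = np.zeros((15, 15))
    A[0:3, 3:6] = np.eye(3)      # δṗ gets δv
    A[3:6, 6:9] = -skew(R @ a)   # δv̇ gets V δθ
    A[3:6, 9:12] = -R            # δv̇ gets −R δa_b
    A[6:9, 12:15] = -R           # δθ̇ gets −R δω_b
    return A
```

Since the bias rows are zero, every product of four such blocks vanishes, so \(\mathbf{A}^4 = \mathbf{0}\) while \(\mathbf{A}^3\) keeps the \(-\mathbf{V}\mathbf{R}\) block.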

The Jacobians \(\mathbf{A}=\partial f(\mathbf{x},\cdot )/\partial \mathbf{x}\) of the continuous-time EKF true (16) and ESKF nominal (17) systems are equal to each other, having

$$\begin{aligned} { \mathbf{A} = \begin{bmatrix} 0&\mathbf{I}&0&0&0 \\ 0&0&\mathbf{V}&-\mathbf{R}&0 \\ 0&0&\mathbf{W}&0&\mathbf{Q} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix} ~,~ \mathbf{A}^2 = \begin{bmatrix} 0&0&\mathbf{V}&-\mathbf{R}&0 \\ 0&0&\mathbf{V}{} \mathbf{W}&0&\mathbf{V}{} \mathbf{Q} \\ 0&0&\mathbf{W}^2&0&\mathbf{W}{} \mathbf{Q} \\ 0&0&0&0&0 \\ 0&0&0&0&0 \\ \end{bmatrix}} \,,\cdots \end{aligned}$$

where \(\mathbf{V}\), \(\mathbf{W}\) and \(\mathbf{Q}\) are defined by

$$\begin{aligned} \mathbf{V}&= \frac{\partial \mathbf{R}\{\mathbf{q}\}\,(\mathbf{a}_S-\mathbf{a}_b)}{\partial \mathbf{q}} \end{aligned}$$
$$\begin{aligned} \mathbf{W}&= \frac{\partial \frac{1}{2}\mathbf{q}\otimes ({\varvec{\omega }}_S-{\varvec{\omega }}_b)}{\partial \mathbf{q}} \end{aligned}$$
$$\begin{aligned} \mathbf{Q}&= \frac{\partial \frac{1}{2}\mathbf{q}\otimes ({\varvec{\omega }}_S-{\varvec{\omega }}_b)}{\partial {\varvec{\omega }}_b}, \end{aligned}$$

and are developed hereafter. For the first Jacobian \(\mathbf{V}\) it is convenient to recall the derivative of a rotation of a vector \(\mathbf a\) by a quaternion \(\mathbf{q}=[w,x,y,z]^\top =[w,\mathbf{v}]^\top \) with respect to the quaternion,

$$\begin{aligned} \mathbf{V}(\mathbf{q},\mathbf{a})&\triangleq \frac{\partial \mathbf{R}\{\mathbf{q}\}\,\mathbf{a}}{\partial \mathbf q} = \frac{\partial (\mathbf{q}\otimes \mathbf{a}\otimes \mathbf{q}^*)}{\partial \mathbf q}\\&= { 2\big [w \mathbf{a} \!+\! \mathbf{v}\!\times \!\mathbf{a} ~\big |~ \mathbf{v}{} \mathbf{a}^\top \!-\! \mathbf{a}{} \mathbf{v}^\top \!+\! \mathbf{a}^\top \mathbf{v}{} \mathbf{I}_3 \!-\! w[\mathbf{a}]_\times \big ] \nonumber }\,, \end{aligned}$$

having therefore

$$\begin{aligned} \mathbf{V} = \mathbf{V}(\mathbf{q},\,\mathbf{a}_S - \mathbf{a}_b)~. \end{aligned}$$

For the Jacobian \(\mathbf{W}\) we have from (52)

$$\begin{aligned} \mathbf{W} = \frac{1}{2}{\varvec{\Omega }}({\varvec{\omega }}_S-{\varvec{\omega }}_b)~, \end{aligned}$$

with \(\varvec{\Omega }({\varvec{\omega }})\) the skew-symmetric matrix defined in (53).

Finally, for the Jacobian \(\mathbf{Q}\) we use (46), (47) and (49) to obtain

$$\begin{aligned} \mathbf{Q} = - \frac{1}{2} \begin{bmatrix} -x&-y&-z \\ w&-z&y \\ z&w&-x \\ -y&x&w \end{bmatrix}. \end{aligned}$$
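These closed-form Jacobians can be cross-checked against finite differences. The sketch below (illustrative NumPy, names ours) does so for \(\mathbf{Q}\); since \(f({\varvec{\omega }}_b) = \frac{1}{2}\mathbf{q}\otimes ({\varvec{\omega }}_S-{\varvec{\omega }}_b)\) is linear in \({\varvec{\omega }}_b\), central differences are exact up to rounding:

```python
import numpy as np

def qprod(p, q):
    """Hamilton product p ⊗ q, quaternions as [w, x, y, z]."""
    pw, pv = p[0], p[1:]
    qw, qv = q[0], q[1:]
    return np.concatenate(([pw * qw - pv @ qv],
                           pw * qv + qw * pv + np.cross(pv, qv)))

def Q_closed_form(q):
    """Q = −½ · (last three columns of Q⁺(q))."""
    w, x, y, z = q
    return -0.5 * np.array([[-x, -y, -z],
                            [ w, -z,  y],
                            [ z,  w, -x],
                            [-y,  x,  w]])

def Q_numeric(q, w_s, w_b, eps=1e-6):
    """Central finite differences of f(ω_b) = ½ q ⊗ [0, ω_S − ω_b]."""
    f = lambda wb: 0.5 * qprod(q, np.concatenate(([0.0], w_s - wb)))
    J = np.zeros((4, 3))
    for i in range(3):
        e = np.zeros(3)
        e[i] = eps
        J[:, i] = (f(w_b + e) - f(w_b - e)) / (2 * eps)
    return J
```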


About this article


Cite this article

Santamaria-Navarro, A., Loianno, G., Solà, J. et al. Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors. Auton Robot 42, 1263–1280 (2018).




  • Micro aerial vehicles
  • Vision for robotics
  • Localization