
Implementation of Control System and Tracking Objects in a Quadcopter System

  • Siva Ariram
  • Juha Röning
  • Zdzisław Kowalczuk
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11659)

Abstract

In this paper, we describe the implementation of a quadcopter assembly with a control and navigation module. The project also includes the design of an operator control panel, which consists of a microcontroller and a glove equipped with sensors and buttons. The panel has a touch screen that displays current vehicle status, including orientation and geographical coordinates. The quadcopter is controlled by the movement of the operator's hand. In addition, we include object detection from the quadcopter's point of view. To detect an object, we need some idea of where the object may be and how the image is divided into segments. This creates a chicken-and-egg problem: we must recognize the shape (and class) of an object by knowing its location, and recognize the location of an object by knowing its shape. Some visual characteristics, such as clothing and a human face, can belong to the same subject, but it is difficult to recognize this without first recognizing the object.
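
The abstract describes hand-movement control via a sensor glove without giving implementation details; the following minimal Python sketch (not taken from the paper; all function names, thresholds, and sample values are assumptions) illustrates one way accelerometer readings from such a glove could be turned into roll/pitch setpoints, using the standard three-axis tilt-sensing formulas.

    import math

    def tilt_angles(ax, ay, az):
        """Compute roll and pitch (radians) of the operator's hand from
        raw 3-axis accelerometer readings (standard tilt-sensing formulas)."""
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        return roll, pitch

    def hand_to_setpoint(roll, pitch, max_tilt_deg=20.0, deadband_deg=3.0):
        """Map hand tilt to quadcopter roll/pitch setpoints (degrees).

        Angles inside the deadband are treated as zero so the vehicle
        hovers when the hand is roughly level; larger angles are clamped
        to +/- max_tilt_deg.
        """
        def shape(angle_rad):
            deg = math.degrees(angle_rad)
            if abs(deg) < deadband_deg:
                return 0.0
            return max(-max_tilt_deg, min(max_tilt_deg, deg))

        return shape(roll), shape(pitch)

    # Example: hand tilted forward and slightly to the right.
    roll, pitch = tilt_angles(ax=0.30, ay=0.10, az=0.95)
    print(hand_to_setpoint(roll, pitch))

The deadband is a common design choice for glove-style controllers: it keeps small, unintentional hand movements from being interpreted as flight commands.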

Keywords

Drone · Quadcopter · Kalman filter · GPS · IMU


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Siva Ariram (1)
  • Juha Röning (1)
  • Zdzisław Kowalczuk (2)
  1. Biomimetics and Intelligent Systems Group (BISG), University of Oulu, Oulu, Finland
  2. Gdansk University of Technology, Gdansk, Poland
