Physical Systems

  • Jung-Sup Um


A CPS (Cyber-Physical System) consists of two main layers: a virtual (cyber) layer and a physical layer. CPS requires connecting numerous physical domains to the cyber world so that AI (Artificial Intelligence) can bridge the IoT, the cloud, and big data. The deep-learning procedures of AI require large volumes of complex data at spatial and temporal scales that closely sense the changing state of the physical world in real time. In the physical layer, sensors are the major elements, and an intelligently deployed network of sensors collects information about the real-world system. IoT sensors produce vast amounts of spatial information through hundreds of millions of devices connected to people, products, and locations. Spatial data, like energy in the first and second industrial revolutions, is a key factor determining the competitiveness of an enterprise in the fourth industrial revolution, because training artificial intelligence requires location and imagery data as spatial information. This spatial information renders the world we live in as a digital twin. This chapter explains the relationship between sensors and spatial information as data essential to training artificial intelligence in a CPS such as a self-driving car.
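The two-layer architecture described above can be sketched in code: a physical layer of spatially located sensors feeding a cyber layer that maintains a virtual copy of their state (a minimal digital twin). This is an illustrative sketch only; the class and attribute names (`PhysicalSensor`, `DigitalTwin`, `read`, `update`) are hypothetical and do not correspond to any real sensor API or to an implementation from this chapter.

```python
import random

class PhysicalSensor:
    """Physical layer: a device that senses the changing state of the world."""

    def __init__(self, sensor_id, location):
        self.sensor_id = sensor_id
        self.location = location  # spatial information, e.g. (lat, lon)

    def read(self):
        # Stand-in for a real measurement (e.g. temperature in degrees C).
        return 20.0 + random.uniform(-1.0, 1.0)


class DigitalTwin:
    """Cyber layer: a virtual mirror of each physical sensor's latest state."""

    def __init__(self):
        self.state = {}

    def update(self, sensor_id, location, value):
        # Each reading is stored with its location, so the twin carries
        # spatial information alongside the sensed value.
        self.state[sensor_id] = {"location": location, "value": value}

    def snapshot(self):
        return dict(self.state)


# Deploy a small sensor network and mirror its readings into the twin.
sensors = [PhysicalSensor(f"s{i}", (35.87 + i * 0.01, 128.60)) for i in range(3)]
twin = DigitalTwin()
for s in sensors:
    twin.update(s.sensor_id, s.location, s.read())

print(len(twin.snapshot()))  # → 3
```

In a real CPS the snapshot would be streamed to training or inference pipelines (the AI side of the bridge), and decisions would flow back to actuators in the physical layer; the sketch covers only the sensing-to-twin direction.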



Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  • Jung-Sup Um
  1. Department of Geography, Kyungpook National University, Daegu, Korea (Republic of)
