1 Introduction

The rapidly growing market for geospatial data and its applications has increased the demand for collecting geospatial data efficiently and economically. Mobile mapping technologies, including multi-sensor integration and multi-platform mapping technology, have established a modern framework for efficient geospatial data acquisition in applications such as conventional mapping, rapid disaster response, smart cities, and autonomous vehicles. Among these applications, the use of mobile mapping systems to build indoor maps for pedestrian navigation and high-definition (HD) maps for autonomous vehicles is among the most popular topics, driven by the booming business opportunities in the geospatial community.

Mobile mapping refers to a means of collecting geospatial data using mapping sensors mounted on a moving platform (El-Sheimy 1996). The original idea of adopting mobile mapping technologies was limited to applications that allowed the determination of exterior orientation parameters using existing ground control points. This procedure is known as georeferencing. In fact, the concept of mobile mapping has been rooted in the geomatics communities ever since photogrammetry was adopted. Research concerning mobile mapping was mainly driven by the need for highway infrastructure mapping and transportation corridor inventories in the late 1980s (El-Sheimy 1996).

Over the following decades, advances in satellite navigation and inertial sensing technology redirected the development of mobile mapping. The trajectory and attitude of the mobile mapper are now determined directly, instead of using ground control points as references for positioning and orienting the images in space. The determination of time-variable position and orientation parameters for a mobile digital imager is known as direct georeferencing (DG), which is the core ingredient of modern mobile mapping technology (El-Sheimy 1996). Figure 25.1 illustrates the evolution of georeferencing technology over the past decades.
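To make the DG concept concrete, one common formulation of the direct georeferencing equation is shown below; the exact notation varies between authors, so the symbols here are illustrative rather than definitive.

```latex
% Mapping-frame coordinates of object point P at exposure time t
r^{m}_{P} \;=\; r^{m}_{b}(t) \;+\; R^{m}_{b}(t)\left( s_{P}\, R^{b}_{c}\, r^{c}_{P} \;+\; a^{b} \right)
```

Here r^m_P denotes the mapping-frame coordinates of object point P; r^m_b(t) and R^m_b(t) are the GNSS/INS-derived position and attitude of the IMU body frame at exposure time t; R^b_c and a^b are the calibrated boresight rotation and lever arm between the camera and the IMU; s_P is an image-point scale factor; and r^c_P is the image vector of P. Once the boresight and lever-arm terms are calibrated, every image or LiDAR return can be georeferenced directly from the time-synchronized trajectory, without ground control points.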

Fig. 25.1 The evolution of georeferencing technology

Cameras and laser scanners or light detection and ranging (LiDAR) sensors, along with positioning and orientation sensors, are integrated and mounted on a moving platform for mapping purposes. Objects of interest can be directly measured and mapped from georeferenced images or point clouds. The most common technologies used for this purpose today are satellite positioning using global navigation satellite systems (GNSS) and inertial navigation using an inertial measurement unit (IMU). They are usually integrated to provide seamless time-variable position and orientation parameters for mobile mapping systems. Figure 25.2 illustrates the scope of mobile mapping technology, including its components, platforms, and applications. Figure 25.3 illustrates examples of the sensors used in an image-based mobile mapping system and their functions.

Fig. 25.2 The scope of mobile mapping technologies

Fig. 25.3 Sensor functions

2 Roadmap of Mobile Mapping Technologies

Pilot demonstrations of land-based mobile mapping technology date back to the demand for a mobile highway inventory system (MHIS) proposed by some Canadian provincial governments and US state governments in the early 1980s. Since the 1980s, at least 1000 land-based mobile mapping systems (including street-view cars) have been put into operation around the world to perform rapid geospatial information acquisition for various applications. The important milestones in this process can be divided into three stages: the pre-INS period, from 1983 to 1993; the post-INS period, from 1993 to 2000; and the LiDAR period, from 2000 to the present. To meet the demands of different users, land-based mobile mapping technology has changed significantly in terms of its positioning and orientation systems over the past 30 years. The first representative system of the pre-INS era was the Alberta MHIS, developed jointly by the Government of Alberta, Canada, and the University of Calgary (Schwarz and El-Sheimy 2008). Early land-based mobile mapping technology adopted dead-reckoning sensors such as gyroscopes, accelerometers, and odometers to derive positioning solutions using the principle of relative positioning; in the 1980s, the imaging sensors used were mostly analog cameras. The images recorded the status of road facilities and provided near-real-time road information for maintenance agencies. The second representative system of this period was a land-based mobile mapping system called GPSVan, developed by the Center for Mapping at The Ohio State University. The system used the Global Positioning System (GPS) and odometers to provide navigation parameters, as illustrated in Fig. 25.4. The primary imaging sensors were two cameras that could continuously capture stereo pairs, and the three-dimensional coordinates of features were obtained by the principles of close-range photogrammetry. The positioning accuracy of GPSVan was 0.3–3 m (Grejner-Brzezinska 2001).

Fig. 25.4 The first land-based mobile mapping technology

The representative system of the post-INS era was the VISAT series developed by the University of Calgary, Canada, which has been developing land-based mobile mapping technology for nearly 40 years. An INS/GPS system was first successfully integrated into the Alberta MHIS in 1994. The resulting architecture, known as the first generation of the VISAT Van (Shin 2005), is shown in Fig. 25.4.

The second generation of VISAT had a complete architecture comprising, for example, an INS/GPS integrated system, odometers, and color charge-coupled device (CCD) cameras (El-Sheimy 1996). This system was the first in the world to introduce a navigation-grade INS (with a gyro drift of less than 0.01°/h) using a ring laser gyroscope (RLG), achieving a positioning accuracy of 0.1–1 m. The system featured an adjustable image-capture interval at high driving speeds (100 km/h). The LiDAR period began in the 2000s; compared with the mobile mapping technology of the first two stages, the primary difference is the addition of LiDAR to the imaging-sensor component. Numerous geospatial information companies around the world, such as Google, Apple, and their competitors, are adopting mobile mapping technology and building a solid digital foundation for countless exciting geospatial applications in the coming decades.

In addition to Google's sustained development of various applications based on Street View technology, Apple began developing its own mobile mapping technology in 2014 and built the exclusive Apple Van to catch up with the progress of Google's geospatial information technology. At the same time, HERE, the world-class navigation map maker funded by Finland's Nokia, also developed its own mobile mapping technology and was later acquired by Germany's three major automakers to produce accurate navigation maps that meet the demands of the automotive industry. Even Toyota exhibited a map-production technology for passenger cars at CES 2016. Mobile mapping technology therefore plays an important role in the development of autonomous driving, as it provides the digital world needed to meet the navigation safety requirements of future autonomous-vehicle applications.

The development of airborne mobile mapping technology dates back to the early 1990s, similar to the development of land-based mobile mapping technology. The important milestones can likewise be divided into three stages: the pre-INS period, from 1985 to 1995; the post-INS period, from 1995 to 2000; and the LiDAR period, from 2000 to the present. In the pre-INS period, many researchers in Europe and America proposed providing the orientation parameters for aircraft using a GPS multi-antenna array (Cohen and Parkinson 1992; El-Mowafy and Schwarz 1994), but the achievable accuracy (0.1–0.03°) was limited by the baseline length (2–10 m) of the multi-antenna array mounted on the aerial survey aircraft and by the resolution of the GPS integer ambiguities.

Since the early 1990s, many researchers in Europe and the United States have recognized the necessity of an INS for the development of airborne mobile mapping technology (Cannon and Schwarz 1990). The earliest configuration of airborne mobile mapping technology with an INS was developed by the Department of Geomatics Engineering at the University of Calgary, Canada (Skaloud et al. 1996). Its DG accuracy without ground control points was about 30–40 cm. The reason the development of airborne systems lagged behind that of land-based systems was the difficulty of acquiring a high-precision INS. Most of the land-based systems developed in the early 1990s used odometers and gyroscopes, whereas the accuracy requirements for the orientation parameters provided by an INS are higher for an airborne system than for a land-based system. The first land-based system using an INS was deployed in 1993, so it is not difficult to understand why the development of airborne mobile mapping technology was slightly behind that of land-based systems.

At the same time, the Center for Mapping at Ohio State University developed a similar Airborne Integrated Mapping System (AIMS) in 1998 with a DG accuracy of about 20–30 cm (Grejner-Brzezinska 2001). The operational flexibility of the DG mode was greatly enhanced, and its practical costs were considerably reduced, especially in applications where few or no ground control points are available. Ip et al. (2004) combined traditional aerial triangulation using ground control points with DG to develop an integrated sensor orientation (ISO) procedure that improves the stability of airborne mobile mapping systems using a limited number of ground control points. The last stage is the LiDAR period; compared with the first two stages of airborne mobile mapping technology, the main difference is the addition of a LiDAR system as an additional imaging sensor. The earliest experiments on airborne laser scanners date back to the 1970s and 1980s, but such airborne mobile mapping systems have only been widely applied in the geomatics community since about 1996, once the data-processing and hardware technologies related to LiDAR and INS/GPS-integrated positioning and orientation systems matured (Axelsson 1999).

However, there are some limitations to conventional airborne mobile mapping systems. The expenses for practicing aerial photogrammetry are high, and there are strict regulations for the permits necessary to practice airborne surveys in most countries. Numerous studies have been conducted to adopt unmanned aerial vehicles (UAVs) for photogrammetry applications. For small and remote-area mapping, UAVs provide an appropriate and inexpensive platform, especially in developing countries. In recent years, more and more UAV-based photogrammetric platforms have been developed, and their performance has been proven in certain scenarios (Chiang et al. 2012).

Nagai et al. (2008) first proposed a UAV-borne mapping system using an unmanned helicopter as the platform equipped with an INS/GPS system to facilitate the DG capability, as shown in Fig. 25.5.

Fig. 25.5 An example of a DG-ready UAV helicopter-based photogrammetric platform. Adopted from Nagai et al. (2008, p. 1217)

Chiang et al. (2012) developed a DG-based UAV photogrammetric platform in which an INS/GPS integrated positioning and orientation system (POS) provided the DG capability of the platform. Rehak et al. (2013) developed a low-cost UAV for direct georeferencing. The advantage of such a system lies in its high maneuverability and operational flexibility as well as its ability to acquire image data without the need to establish ground control points (GCPs).

Chiang et al. (2017) proposed a LiDAR-based UAV mapping system. The UAV integrates an IMU, a GNSS receiver, and a low-cost LiDAR, as illustrated in Fig. 25.6. An unmanned helicopter was introduced, and a multi-sensor payload architecture for direct georeferencing was designed to improve the capabilities of the vehicle.

Fig. 25.6 An unmanned helicopter-based LiDAR mapping system

The development of shipborne mobile mapping technology dates back to 2005 (Zach et al. 2011). Its primary system architecture follows that of the land-based mobile mapping system, with the addition of a stabilizer to mitigate the accuracy degradation caused by wave-induced motion. Zach et al. (2011) applied a shipborne system based on the RIEGL VMX-250, with a GNSS receiver and a tactical-grade IMU, to scan the relevant monuments along a canal in Venice, Italy. The objects on both sides of the canal were scanned and recorded along the vessel's track.

The development of portable mobile mapping technology can be traced back to the early 2000s, when the Department of Geomatics Engineering at the University of Calgary in Canada developed a prototype of a lightweight and low-cost personal mobile mapping system. The DG horizontal positioning accuracy of the system without control points was about 20 cm, and the vertical positioning accuracy was about 10 cm (Ellum 2001). This prototype utilized a digital magnetic compass instead of an IMU to provide attitude information; however, a digital magnetic compass is vulnerable to magnetic-field interference in urban areas and is therefore unstable (Ellum 2001). A portable mapping system is especially beneficial for disaster response applications. The disadvantage of a land-based system is the discontinuity of image acquisition caused by the limitations of road-network connections in narrow lanes. Portable mobile mapping systems are designed to cope with such situations, as illustrated in Fig. 25.7.

Fig. 25.7 Example of portable mobile mapping systems

3 Recent Progress on Mobile Mapping Technology

A mobile mapping system comprises digital imaging systems, positioning and orientation systems, and various operating platforms and application scenarios, as illustrated in Fig. 25.1. The development, hardware cost, and accuracy requirements of mobile mapping systems are highly correlated. In recent years, driven by increasing demand for automation of mapping processes in the geospatial information industry, mobile mapping systems have moved from the prototype development stage conducted by professional research institutions before 2005 to commercially viable products that enable innovative solutions in the geospatial information industry. In addition, the robotics industry extensively applies similar concepts and sensors to develop perception technologies for navigating robots in unknown environments. Compared with the mobile mapping systems developed by the geospatial information industry, the environmental perception technology developed by the robotics industry has the advantage of low price, but its accuracy is not yet sufficient for geospatial applications. The development of mobile mapping technology in these two areas will stimulate considerable interest and further expand the penetration of geospatial information into other communities. Mobile mapping technology will therefore continue to evolve according to the fundamental requirements of users, who are pursuing lower hardware costs, higher accuracy, and higher profits. Future development trends can thus be discussed in terms of the evolution of digital imaging systems, positioning and orientation systems, operating platforms, and application scenarios.

3.1 Digital Imaging Systems

Current mobile mapping systems have fully adopted digital image sensors. These sensors include frame-based digital cameras, multi-spectral line scanners using line-scan technology, and interferometric synthetic aperture radar (IFSAR/InSAR). The development of mobile mapping systems is closely related to the progress of digital imaging technology, and among imaging sensors, the evolution of frame-based digital cameras has played the most important role. These cameras developed in line with LiDAR mobile mapping systems; however, owing to the limited resolution of the CCD cameras available in the 1990s, they were used mainly in land-based mapping systems, because the effective measurement distance in a land-based scenario is much shorter than the flying heights required for airborne applications.

In recent years, the resolution and image size of CCD cameras have gradually improved. Numerous high-performance digital single-lens reflex cameras have been developed and tested for airborne mobile mapping systems, and the results are quite encouraging. The advantages of using a digital camera are obvious: no film negatives need to be scanned, which improves mapping efficiency; digital image processing improves the automation of feature extraction; and digital images are easier to update and store.

Among these digital imaging systems, the IFSAR airborne mapping system has received increasing attention in the geospatial information community in recent years (see Chap. 21). It is characterized by rapid deployment, nearly all-weather operation, and effective penetration of clouds. Another important development in digital imaging technology for airborne mapping systems is the airborne hyperspectral imaging system. Through a combination of different spectral bands, many important features can be derived to support environmental monitoring, mining exploration, vegetation inspection, disaster prevention, and land-resource management.

Recently, low-cost mobile mapping systems have increasingly adopted depth cameras such as the Kinect. For indoor scenes, such systems have the advantage of being low cost and mass-produced for the consumer market. Google and Apple are competing to combine inertial sensing, depth cameras, and CCD cameras to create indoor 3D models with mobile devices.

3.2 Positioning and Orientation Systems

GPS is a navigation satellite positioning system developed by the United States in the late 1970s. Currently, 32 satellites operate in orbits about 20,000 km above the Earth's surface. Since the original design has been in place for some 30 years, the United States has implemented a GPS modernization plan, adding new, improved-quality measurements to meet the demands of the coming years. More importantly, the GPS modernization plan upgrades the original dual-frequency system to a tri-frequency system.

In 2001, the Russian government decided to continue to maintain the operation of GLONASS and proposed a plan similar to GPS modernization. The program added 24 new satellites by the end of 2010 in order to provide accurate navigation services worldwide. Like the modernized GPS, the future GLONASS can provide tri-frequency civilian signals for accurate positioning, navigation, and time-related applications.

The Beidou Navigation Satellite System is the GNSS developed by China. It is committed to providing high-precision positioning, navigation, and timing services to users around the world, and can further provide higher-accuracy services to authorized military and civilian users.

The Galileo system is the GNSS built by the European Union. After the US GPS, Russia’s GLONASS, and China’s Beidou system, it is the fourth system to provide civilian global satellite navigation services. The primary purpose of the Galileo system is to provide civilian navigation, which is different from the three systems mentioned earlier.

The GPS Block IIF satellites and the new generations of GPS III that are currently being launched are capable of transmitting tri-frequency signals, and the GLONASS-M and the GLONASS-K introduced after 2014 have also added the third frequency. After the completion of the Galileo and Beidou systems, the multi-frequency observation using multi-system GNSS is bound to bring higher satellite visibility and improved accuracy to mobile mappers around the world. In the future, whether it is real-time kinematic positioning for navigation purposes or post-processing kinematic or static-baseline solutions for geodetic requirements, users can use multi-system GNSS receivers to enjoy better positioning results. It is expected that after 2020, a general user will be able to use the multi-frequency measurements provided by GNSS to achieve improved positioning accuracy.

At present, network RTK (e-GPS or e-RTK) technology for kinematic positioning with virtual reference stations is widely used in the geomatics community. For mobile mapping applications, however, the real-time data link required by RTK on high-speed platforms is a challenge, so e-GPS or e-RTK is not currently a viable option for mobile mapping. An important issue for the multi-sensor positioning and orientation software used in mobile mapping systems is therefore how to achieve differential kinematic positioning with GNSS virtual reference stations in a post-processing architecture.

The development of the mobile mapping system has been highly correlated with the development of strapdown inertial sensing technology. From a DG perspective, there would be no booming mobile-mapping industry without the advancement of inertial sensing technology. In principle, an IMU contains three gyroscopes and three accelerometers and provides compensated raw measurements, namely velocity and orientation changes along the three axes of its body frame. Users who require real-time navigation solutions from an IMU need an external computer running inertial navigation mechanization algorithms. An INS, by contrast, is an IMU combined with a navigation computer that directly provides real-time navigation solutions in the chosen navigation frame, in addition to the compensated raw measurements. The main distinction between an IMU and an INS is therefore the ability to provide real-time navigation solutions: the former provides only compensated inertial measurements, whereas the latter provides real-time navigation solutions as well.
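As a rough illustration of what such a mechanization computer does, the following sketch propagates attitude, velocity, and position in a local-level frame from compensated gyro and accelerometer outputs. It is a deliberately simplified, assumed formulation: Earth-rotation, transport-rate, and coning/sculling corrections that a real INS mechanization includes are omitted.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def mechanize_step(C_bn, v_n, r_n, gyro, accel, dt,
                   g=np.array([0.0, 0.0, 9.81])):
    """One simplified strapdown propagation step in a local-level NED frame.
    C_bn  : 3x3 body-to-navigation rotation matrix
    v_n   : velocity in the navigation frame (m/s)
    r_n   : position in a local Cartesian navigation frame (m)
    gyro  : compensated angular rate in the body frame (rad/s)
    accel : compensated specific force in the body frame (m/s^2)
    """
    # Attitude update, first-order approximation of the incremental rotation
    C_bn = C_bn @ (np.eye(3) + skew(gyro * dt))
    # Velocity update: rotate specific force into the navigation frame, add gravity
    v_n = v_n + (C_bn @ accel + g) * dt
    # Position update by simple integration of velocity
    r_n = r_n + v_n * dt
    return C_bn, v_n, r_n
```

In a real system this loop runs at the IMU data rate (typically hundreds of hertz), and the error growth it accumulates is what the GNSS updates in the fusion filter later correct.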

For mobile mapping applications, the standard operating procedure is to compute the precise positioning and orientation solution through post-processing. Taking the same measurements as an example, over the same GNSS signal outage period, the positioning accuracy obtained by post-processing software using smoothing algorithms is nearly 60% better than that of a real-time solution using filtering algorithms. For this reason, an IMU (without a real-time navigation computer) is sufficient for mobile mapping applications.

In recent years, the rapid evolution of inertial sensing technology based on micro-electro-mechanical systems (MEMS) has driven another advance in the sustainable development of mobile mapping technology. A MEMS IMU is low cost and provides acceptable performance compared with a fiber-optic gyroscope (FOG) IMU of similar specifications; its price is only about half that of its FOG counterpart, and the stability of MEMS IMUs continues to improve over time. At present, MEMS IMUs with gyroscope drifts of 0.5°/h are available for mobile mapping applications.

3.3 Sensor Fusion Algorithms

The Kalman filter (KF) has been widely recognized as the standard optimal estimation tool for current sensor-fusion schemes. However, the major inadequacy of the KF for sensor fusion is the need for a predefined, accurate stochastic model of each sensor's errors: prior information about the covariance of each sensor measurement, as well as the statistical properties (i.e., the variance and the correlation time) of each sensor system, must be accurately known (Schwarz and El-Sheimy 2008). Furthermore, for mobile mapping applications, where the process and measurement models are nonlinear, the extended Kalman filter (EKF) operates under the assumption that the state variables behave as Gaussian random variables. The EKF may also work for nonlinear dynamic systems with non-Gaussian distributions, except in the case of heavily skewed nonlinear dynamic systems, where the EKF may experience problems (Chiang et al. 2009).
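For readers unfamiliar with the mechanics involved, the sketch below shows the discrete predict/update pair at the heart of a (linearized, error-state) KF as used for loosely coupled INS/GNSS fusion. The matrices F, Q, H, and R are placeholders that the designer must derive from the INS error model and the GNSS measurement model; this is an illustrative sketch, not a production implementation.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Propagate the state estimate x and covariance P through the model F."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def kf_update(x, P, z, H, R):
    """Correct the prediction with a measurement z (e.g., GNSS position minus
    INS-predicted position in an error-state formulation)."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)              # state correction
    P = (np.eye(len(x)) - K @ H) @ P     # covariance update
    return x, P
```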

Compared with real-time filtering, post-processing has the advantage of using the entire data set to estimate a trajectory; this is not possible with filtering, because only a fraction of the data is available at each sampling instant. When filtering is used in a first step, an optimal smoothing method, such as the Rauch-Tung-Striebel (RTS) backward smoother, can then be applied (Chiang et al. 2009). For most surveying applications that require superior accuracy, only data acquisition has to be performed in real-time; data processing and analysis are post-processed. The procedures for general mobile mapping applications include data acquisition, georeferencing, measurement, and GIS processing. Only the acquisition of IMU, GNSS, CCD image, and LiDAR point-cloud data needs to be performed in real-time. The georeferencing process, which attaches position and orientation stamps to the images, and the measurement process, which obtains the 3D coordinates of all important features and stores them in a GIS database, can be implemented in post-mission processing in accordance with the accuracy requirements of these processes (El-Sheimy 1996).
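A minimal sketch of the RTS backward pass is given below. It assumes the forward filter has stored, for each epoch, the filtered state and covariance, the one-step predicted state and covariance, and the transition matrix; the variable names are illustrative and not tied to any particular software package.

```python
import numpy as np

def rts_smooth(x_f, P_f, x_p, P_p, F):
    """RTS backward smoother.
    x_f, P_f : lists of filtered states/covariances for epochs 0..n-1
    x_p, P_p : lists of one-step predicted states/covariances (index k is the
               prediction of epoch k from epoch k-1)
    F        : list of transition matrices, F[k] maps epoch k to k+1
    """
    n = len(x_f)
    x_s = [None] * n
    P_s = [None] * n
    x_s[-1], P_s[-1] = x_f[-1], P_f[-1]   # start from the last filtered epoch
    for k in range(n - 2, -1, -1):        # run backward in time
        A = P_f[k] @ F[k].T @ np.linalg.inv(P_p[k + 1])   # smoother gain
        x_s[k] = x_f[k] + A @ (x_s[k + 1] - x_p[k + 1])
        P_s[k] = P_f[k] + A @ (P_s[k + 1] - P_p[k + 1]) @ A.T
    return x_s, P_s
```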

According to Chiang et al. (2009), the development of the multi-sensor fusion algorithms for mobile mapping applications can be divided into the following categories:

  • Sampling filter approach: The main feature is to establish an error dynamic model and a sensor error model based on statistical characteristics, following the concept of the traditional KF; however, whereas the nonlinear INS/GNSS integration problem is linearized when the KF is used, most of these newer sampling filter algorithms use nonlinear models to deal with navigation and positioning problems. The traditional KF provides the best solution for an approximate model, whereas sampling filters can provide approximate solutions for accurate models (see the sigma-point sketch after this list).

  • Artificial intelligence approach: The common feature of such algorithms is that they approximate the nonlinear dynamic models with artificial intelligence techniques that imitate human learning.

  • Hybrid approach: Such fusion algorithms combine current KF/smoother-based algorithms with artificial intelligence to develop hybrid algorithms.
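To illustrate the sampling idea behind the first category above, the sketch below generates the sigma points and weights of the unscented transform used by filters such as the unscented Kalman filter (UKF). The scaling parameters follow a commonly published formulation and are assumptions for illustration only; this is not a complete filter.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate 2n+1 sigma points and their mean/covariance weights for a
    state mean x (length n) and covariance P."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)      # matrix square root of scaled P
    pts = [x]
    for i in range(n):
        pts.append(x + S[:, i])
        pts.append(x - S[:, i])
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))   # weights for the mean
    Wc = Wm.copy()                                   # weights for the covariance
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return np.array(pts), Wm, Wc
```

Each sigma point is propagated through the full nonlinear model, and the predicted mean and covariance are recovered from the weighted set, which is how sampling filters avoid the linearization step of the EKF.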

3.4 Collaborative Mobile Mapping Schemes

The shortcomings of airborne mobile mapping technologies are similar to those of traditional aerial survey technologies, such as weather dependence and operating-range limitations. Compared with traditional surveying technologies, land-based mobile mapping technologies are less intrusive and acquire geospatial information more efficiently. While a land-based mobile mapping system can operate under poor weather conditions, it is sensitive to the quality of the GNSS signal, and its operating environment is limited by the existing road network. The mobility of portable mobile mapping technology is much higher than that of the other two, and it offers better operational flexibility.

Land-based mobile mapping systems can conduct control surveying, surface-feature collection, rapid mapping, and image-database updating. The ability to directly georeference imagery with an airborne mobile mapping system can provide the features of the surface entities under observation. Through the images provided by the vehicle, the user can quickly complete the mapping process and establish the large volume of attribute data required by a GIS for further analysis. At the same time, the portable system provides fast attribute updates to maintain the correctness of terrain features and database properties. In other words, mobile mapping technologies with collaborative mapping schemes are able to complete the mapping process rapidly, compared with the large amount of manpower and cost required to perform the same task using an aerial or geodetic survey. The savings in manpower and operational costs with collaborative mobile mapping schemes are therefore considerable. Figure 25.8 illustrates an example of collaborative mobile mapping with airborne and land-based mobile mapping technologies.

Fig. 25.8 An example of collaborative mobile mapping

3.5 Mobile Mapping Technology for Rapid Disaster Response Applications

In recent years, numerous natural disasters have occurred due to drastic climate changes at the global level. It is very important to rapidly obtain geospatial information in disaster areas to provide subsequent analysis and decision-making. In this situation, collaborative mobile mapping technology can provide sufficient capacity to solve this problem. Therefore, the development of low-cost, high-mobility mapping systems for timely intelligence acquisition and processing for disaster response is an attractive research theme among the geomatics community.

Satellite imagery has many limitations, such as weather conditions, overlap percentages, spatial and temporal resolution, and price. Aerial vehicles such as airplanes, helicopters, hot air balloons, and unmanned aircraft are relatively inexpensive options, especially with the recent development of airborne mobile mapping technology. Unmanned aerial mobile-mapping systems have high mobility in small areas. In the case of post-disaster rescue and assessment, they can be used to provide timely information that is necessary to cope with emergency situations. Today, high-resolution satellite imagery is still used to improve disaster response and relief. However, unmanned aerial vehicles are the best choice for small-area surveys, especially in developing countries.

Mobile devices are now ubiquitous, and their built-in sensors, which usually include GNSS receivers, IMUs, and high-definition cameras, are quite suitable for certain mobile mapping applications. Mobile devices have the advantages of being low cost and widespread compared with classic mobile mapping systems, thus providing considerable convenience for rapid data-acquisition missions, as shown in Fig. 25.9. The achievable 2D positioning accuracy of the smartphone mobile mapping system shown in Fig. 25.9, using commercial smartphones, is around 1 m with object distances ranging from 10 to 15 m.

Fig. 25.9 Smartphone mobile mapping technology

Such devices are suitable for disaster response applications with low accuracy requirements because their high penetration rate can efficiently accelerate disaster relief efforts. The future of mobile mapping technologies utilizing mobile devices therefore holds considerable economic benefits and business potential.

3.6 Mobile Mapping Technology for Indoor Mapping Applications

Geospatial information is becoming increasingly popular with the penetration of mobile devices into daily life. With the expanding demand for location-based services (LBS), the geospatial information industry's attention is shifting from outdoor to indoor environments, where many new business opportunities can be discovered. Google, Microsoft, and their competitors around the world are showing strong interest in indoor mapping and navigation applications. Google is currently implementing indoor business maps in the United States, Australia, Japan, and Taiwan, which has aroused great interest within the industry. However, the biggest technical challenge of indoor mapping lies in the lack of a unified source of maps, unlike outdoor maps, which can be obtained through existing collaborative mobile mapping systems. Another major problem is the frequency of updating indoor maps; for example, counters in department stores change frequently, making maintenance difficult. The main methods of building indoor maps rely on architectural blueprints or traditional surveying processes, but these methods are time-consuming and laborious, and it is difficult to achieve the relevant standards. Therefore, the application of collaborative mobile mapping can be extended to indoor mobile mapping technologies, such as the use of pedestrians and strollers as platforms for indoor mapping. Figure 25.10 illustrates a map of indoor parking lots produced with an electrically powered indoor mapping cart. The 3D positioning accuracy of this map is 30 cm.

Fig. 25.10 Indoor mobile mapping technology

In addition, LiDAR-based indoor mapping platforms can be applied for underground environmental exploration in the field of mining as well as underground facility inspections.

3.7 Mobile Mapping Technology for Autonomous Vehicle Applications

Autonomous driving vehicles, or self-driving cars, have made enormous progress in recent years. According to the classification proposed by the Society of Automotive Engineers (SAE) International, driving systems can be divided into six levels. The lowest level (Level 0) is the most primitive: the driver controls the mechanical and physical functions of the vehicle without any automated driving intervention, although individual functions or devices, such as the electronic stability program (ESP) or anti-lock braking system (ABS), may be added to improve driving safety. At Level 1, the vehicle is still mainly controlled by the driver, but additional automation functions reduce the driver's operating burden; for example, adaptive cruise control (ACC) automatically maintains a safe distance from vehicles ahead and warns about lane departures. Systems that combine several such functions, for example autonomous emergency braking (AEB) with blind-spot detection and collision avoidance technologies to reduce accidents caused by collisions, belong to Level 2. Level 3 is conditional automation, in which the driver must still be ready to intervene at any time in an emergency; Level 4 is high automation within a defined operational domain; and Level 5 is full automation under all conditions, supported by communication between vehicles. However, to achieve a fully autonomous driving level, self-driving cars still face the following three major challenges:

  • Autonomous vehicles must know their location and navigation information

  • Overcoming the inability of in-vehicle sensors to perceive objects that are occluded or too distant

  • Connecting the autonomous vehicles with other vehicles to ensure road safety.

To achieve Level 4 or higher functional safety, obtaining precise position information for the vehicle on the road is the most basic requirement, so that autonomous vehicles can drive on the correct road in a known environment. In addition, according to advanced vehicle-safety research, if navigation equipment is to be upgraded to the level of autonomous driving, the navigation accuracy of the vehicle must be improved to the sub-meter level or better. Because satellite signals are blocked or reflected in urban areas, autonomous vehicles cannot be positioned accurately in the correct lane by GNSS alone. With advances in computing and sensor technologies, onboard systems that integrate cameras, LiDAR, GNSS, INS, and other perception sensors can process large amounts of data continuously, accurately, and in real-time. These systems also handle several specialized functions, such as positioning, mapping, perception, motion planning, and control, all of which are essential for the vehicle to achieve fully autonomous operation. On the other hand, taking safety and hardware costs into consideration, maps with navigation information for autonomous vehicles can provide reliable and robust prior information about the environment. These maps are called HD maps and are essential for the operation of autonomous driving technology.

Compared with 2D digital navigation maps designed for human viewing, autonomous vehicles need to make real-time decisions through map feedback during driving to allow passengers to reach their destinations safely. HD maps provide detailed map information for navigating autonomous vehicles to ensure navigation safety. The map itself serves as an additional pseudo-sensor in the car and significantly enhances the performance and accuracy of the perception and positioning algorithms needed for the vehicle to drive autonomously. The difference between HD maps and current 2D digital navigation maps is that the user of the map changes from a person to a machine. The mapping accuracy, the road attributes on the map, and even the geometric relationships among lanes, traffic signs, and roads must be precisely defined to meet the safety requirements of autonomous vehicles. Thus, the current mapping specifications for producing navigation maps can no longer meet the needs of production, maintenance, and inspection in the case of HD maps. The conditions and definitions required for HD maps are given below (an illustrative sketch of a lane record follows the list):

  • HD maps need to achieve sub-meter accuracy or better.

  • All map information must be in 3D with sufficient accuracy.

  • Features (including lanes, road boundaries, traffic signs, etc.) in the real world must be clearly defined on the map, and detailed attribute data should be attached.

  • The scale of the HD maps must be consistent with the real world; that is, there can be no tolerance for scale problems.

  • The maps must provide dynamic map information for the vehicle to make driving decisions.

Thus, the navigation system can accurately guide the vehicle and handle situations such as non-planar road sections, viaducts, and underpasses. Figure 25.11 shows the differences and accuracy requirements among the digital maps used by in-vehicle navigation systems, the ADAS maps used by advanced driver assistance systems, and the HD maps used by autonomous vehicles.
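As a purely illustrative example of what these conditions imply for the map content, the sketch below shows a hypothetical, much-simplified lane record with 3D geometry and attached attributes; the field names are assumptions and do not correspond to the OpenDRIVE or Taiwan HD map schemas.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class LaneRecord:
    """A hypothetical, simplified HD-map lane element."""
    lane_id: str
    # 3D centerline vertices (E, N, h) in meters in a projected reference frame
    centerline: List[Tuple[float, float, float]]
    width_m: float
    curvature_1_per_m: float
    slope_percent: float
    speed_limit_kph: int
    lane_type: str = "driving"                       # e.g., driving, shoulder, bike
    predecessor_ids: List[str] = field(default_factory=list)
    successor_ids: List[str] = field(default_factory=list)
    traffic_sign_ids: List[str] = field(default_factory=list)
```

Even this toy structure reflects the conditions above: every vertex is 3D, geometry is stored at real-world scale, and semantic attributes are attached to each feature rather than implied by map symbology.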

Fig. 25.11 Difference between existing navigation maps and HD maps

To produce HD maps, multi-sensor integration schemes are necessary to perceive the surrounding scene; the sensors can be divided into active and passive components. Active sensors, such as LiDAR and radar, emit signals and measure the returns to obtain the distance to a target; they are more limited in range but less sensitive to the external environment. Passive sensors only receive external information, for example, integrated navigation devices with GNSS and IMU, and visual odometry using cameras. The multi-sensor integrated schemes most commonly used are stationary terrestrial laser scanners (STLSs), mobile terrestrial laser scanners (MTLSs), and aerial laser scanners (ALSs); their characteristics are summarized in Table 25.1. Among them, the accuracy of an STLS is consistent with HD map production, but the cost of mapping and collecting road information over a large area with an STLS is too high. An ALS is free of road obstacles when collecting urban HD maps, but flying in cities with many high-rise buildings is still dangerous, and its resolution is not sufficient for producing HD maps. The most suitable option for HD map production is therefore the MTLS. Google, Apple, HERE, and their competitors around the world are applying land-based mobile mapping technologies with MTLSs to map the high-definition digital world for autonomous vehicles (Fig. 25.12).

Table 25.1 Sensor matrices for building HD maps (after Farrell et al. 2016)

Fig. 25.12 HD map production with mobile mapping technology (Chiang et al. 2019)

3.8 The Latest Developments of HD Maps for Autonomous Driving Applications in Taiwan

The 3D coordinates of lane markers, traffic signs, and other relevant parameters in HD maps, such as curvature and slope, are essential for controlling driving behavior. They serve as the reference of last resort when vision- or radar-based environment-sensing systems fail, and they provide important additional guarantees for the safe driving of vehicles. If machines one day surpass humans' ability to sense, reason, and make decisions in real-time, and artificial intelligence technology can guide vehicles safely and comfortably, then HD maps may no longer be needed in the long run. At present, however, HD maps remain necessary for the navigation, research, and development of autonomous vehicles. Table 25.2 lists the autonomous driving classifications, the required map types, and the accuracy requirements according to Fig. 25.11 and the SAE classification of driving systems.

Table 25.2 Classification and map type requirements of autonomous driving

In terms of industry trends, since the business opportunities of autonomous driving and mapping technologies are promising, international manufacturers have been competing to position themselves early. In addition to Google's continued development of various applications based on Street View technology, Apple began developing its own mobile mapping technology in 2014 and built an exclusive Apple Van to compensate for its disadvantage in spatial information compared with Google. HERE, the mapping company formerly owned by Nokia of Finland, supplies a chain of products and services that includes data collection, map information services, and user map design. It operates more than 300 surveying and mapping vehicles around the world to generate HD maps and is the main map supplier to traditional car manufacturers such as BMW, Mercedes-Benz, and Audi for the development of autonomous driving technology. Another map supplier, TomTom, covers more than 150 countries with road map resources totaling more than 60 million kilometers, including existing business areas such as map licensing and cooperation with the automotive industry. In recent years, TomTom has focused on the production of HD maps for autonomous driving navigation and has proposed a 3D mapping technology known as RoadDNA to construct and update HD maps. In Japan, with the support of the national government, a dynamic mapping platform (DMP) was established by the electronic information industry in partnership with domestic automakers to quickly meet the demands of the Japanese automotive industry for HD maps. In summary, major international mapping companies and car manufacturers currently use MMS to generate HD maps according to their mapping and autonomous driving technology requirements.

The Department of Land Administration of the Ministry of the Interior in Taiwan has proposed a Taiwan HD maps infrastructure that consists of three major pillars: qualified point clouds, qualified digital vector maps, and a Taiwan HD map format composed of the OpenDRIVE format with local extension modules. In addition to embodying the concept of an open base map, this architecture provides interoperability between various HD map formats, as it is designed to give map makers and autonomous driving operators an exchange format that facilitates value-added applications through conversion to the specific formats used by different autonomous vehicle platforms. It is also designed to support non-autonomous-driving applications, such as disaster prevention, asset management, and the traditional surveying and mapping industry, through verified high-precision point clouds and diversified vector-layer designs that realize the concept of data sharing. Figure 25.13 illustrates the overall structure of Taiwan HD maps as well as certain formats used by different end-users (Chiang et al. 2019).

Fig. 25.13 The construction of Taiwan HD maps

Currently, most Taiwanese autonomous driving platforms apply HD maps in the Autoware map format, developed by Tier IV in Japan, as well as the OpenDRIVE format. Therefore, the Department of Land Administration of the Ministry of the Interior has been producing two HD map formats, the Taiwan HD map format and the Autoware map format, for the two primary autonomous vehicle test facilities in Taiwan in order to meet the growing demands for HD maps from various end-users. At the same time, conversion tools between Taiwan HD maps and certain end-user formats listed in Fig. 25.13 are also under development by the Department of Land Administration of the Ministry of the Interior (Chiang et al. 2019).

The scenarios for HD map applications in Taiwan are proposed based on the concept of the local dynamic map (LDM; Shimada et al. 2015), as shown in Fig. 25.14. The exchange of time-variable data (such as the signal phases of traffic lights) and geospatial data (such as GNSS location information) among traffic participants can provide real-time information through communication sensors to improve the safety, efficiency, and comfort of the transportation system and reduce the impact of traffic on the environment. This allows static, temporary, and dynamic traffic information to be integrated, with time-stamped and georeferenced data entered into the LDM as an integrated platform.

Fig. 25.14 The scenario of HD maps application

The LDM is a database that integrates real-time autonomous vehicle and traffic information into HD maps to achieve dynamic map-data sharing. The term local derives from the autonomous vehicle's demand for geospatial information close to its points of interest; the term dynamic derives from the need to use dynamic, time-stamped traffic information to avoid collisions within a very short time; and the term map reflects the association of all this information with a map. Local dynamic maps contain the following layers (Shimada et al. 2015), summarized in a data-structure sketch after the list:

  • Static information and permanent static data: The first layer comes from geographic information system (GIS) map providers, including roads, lanes, intersections, road signs, traffic signs, road facilities, and points of interest (POI), phase data, and building location information, which are created by using a professional mobile mapping system. Update frequency is at least once a month.

  • Semi-static information and transient static data: This layer mainly contains information about roadside infrastructure, including traffic regulations, traffic control schedules, road engineering traffic attributes, and area weather forecasts provided by the road-traffic control department. The information is obtained from outside the autonomous vehicle. Updating frequency of this information is at least once an hour.

  • Semi-dynamic information and transient dynamic data: This mainly includes temporary regional traffic information, traffic control information, accident information, congestion information, phase conditions of traffic lights on roads or traffic signs, and local weather. The information is obtained from outside the autonomous vehicle. Updating frequency is at least once per minute.

  • Dynamic information (highly dynamic data): This layer contains information delivered through vehicle-to-everything (V2X) communication nodes, such as the real-time status of traffic participants, surrounding vehicles, pedestrians, and the timing of traffic signals. The information is updated in real-time. Dynamic information is composed of environmental information and road-ahead information provided by an intelligent transportation system (ITS).
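The four layers and their nominal update intervals can be summarized in the following illustrative data structure; the names and values are assumptions drawn from the descriptions above, not from any LDM standard implementation.

```python
from dataclasses import dataclass

@dataclass
class LdmLayer:
    """One layer of a local dynamic map (illustrative only)."""
    name: str
    example_content: str
    min_update_interval_s: float

LDM_LAYERS = [
    LdmLayer("static",       "roads, lanes, signs, POIs (HD map base)", 30 * 24 * 3600.0),  # ~monthly
    LdmLayer("semi-static",  "traffic regulations, control schedules",  3600.0),            # hourly
    LdmLayer("semi-dynamic", "accidents, congestion, signal phases",    60.0),              # per minute
    LdmLayer("dynamic",      "V2X: surrounding vehicles, pedestrians",  0.1),               # real-time
]
```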

In order to extend the spectrum of local development in the mapping and autonomous driving market, it is urgent to establish autonomous vehicle testing facilities and to implement a unified HD map format standard and regulation in Taiwan. The format standard for the static HD map layer is the primary task at present. The ultimate task is to build a static map that provides rich semantic information with sufficient accuracy to restrict and control vehicle behavior; this mainly includes the lane network, transportation facilities, the road network, and the positioning layer. The Department of Land Administration of the Ministry of the Interior therefore proposes to implement the production process for the static layers of Taiwan HD maps using a professional mobile mapping system, as shown in Fig. 25.15, to meet the requirements for production, maintenance, verification, and correctness specified in the "HD Maps Field Practice Guidelines v2," the "Quality Verification Guidelines for HD Maps," and the "HD Maps Data Contents and Formats Standard," to be published soon by the Taiwan Association of Information and Communication Standards.

Fig. 25.15 Taiwan HD maps production procedure

Meanwhile, the applicability of HD maps is evaluated with autonomous vehicle simulators and real vehicles to ensure that the Taiwan HD map format standards and services satisfy the requirements of autonomous vehicle applications in Taiwan and are in line with international standards (Chiang et al. 2019).

4 Future Trends in Mobile Mapping Technology

The recent boom in the big-data market and deep-learning-related applications has been fueled by geospatial intelligence, and the importance of multi-platform mobile mapping technologies is thus being recognized by various communities. In fact, the widespread adoption of mobile mapping technologies across communities such as the geospatial, robotics, computer vision, artificial intelligence, and navigation communities exceeds the expectations of the pioneers from the geospatial community who initially developed such technologies thirty years ago and continue to promote them today.

Geospatial data are collected with mapping sensors mounted on various human-controlled or unmanned platforms, such as aircraft or helicopters, land vehicles, marine vessels, strollers, and devices hand-carried by individuals. Mobile mapping systems therefore play a crucial role in urban informatics applications, since timely and accurate geospatial data are the key ingredient of the digital infrastructure serving as the backbone of urban informatics. Figure 25.16 depicts an indoor mapping scenario in which a floorplan is built with a robot and an indoor UAV, respectively, where the 3D positioning accuracy achieved was around 1–1.5 m, depending on the scenario.

Fig. 25.16 An example of unmanned mobile mapping technology

Ultimately, the future technological trends in mobile mapping that will advance urban informatics applications can be characterized by (1) fulfilling seamless mapping scenarios; (2) increasing use of low-cost direct georeferencing devices; (3) increasing use of artificial intelligence; and (4) increasing use of unmanned multi-platform systems for collaborative mapping.

5 Conclusion

This chapter has comprehensively discussed mobile mapping technologies. From labor-intensive indirect georeferencing to efficient DG, it is clear that the evolution has been rapid and that many researchers have contributed to the development of this technology. This technology also plays an important role in emerging applications such as autonomous driving and rapid disaster response; in other words, accurate geospatial data will become one of the game-changers of the future. It is worth noting that the individual components of mobile mapping technologies appear in virtually every geospatial data-acquisition technology, such as computer vision, simultaneous localization and mapping (SLAM), and robotic mapping. In the foreseeable future, we are likely to see the ever-increasing importance of mobile mapping technologies.