
1 Introduction

Augmented reality (AR) is an emerging technology that has the potential to transform the way we interact with the built environment during emergency response and decision making. Traditionally, 2D maps posted in a building show occupants where they are and which exits are available, and these maps help patrons form a mental representation of the building for emergency response. We hope that our research will help inspire future applications of AR in emergency management that replace 2D maps with 3D visualizations offering a better understanding of space, mitigating risk, and improving public safety. AR can add spatial context within the building that allows users to better understand their position in it. We propose that emergency response and emergency preparedness in a building are influenced by the presence of evacuation maps, the ability to display and interact with them, the cognitive mapping of the built environment, and the ability to explore this information using AR. We use a series of permanent assets in the building (such as room numbers, boards, and displays) as markers to enhance spatial perception and situational awareness of multilevel spaces through the proprioceptive affordances of situated AR evacuation displays. These permanent assets are used to determine the user's current location when GPS and wireless connectivity are unavailable in an emergency, and they trigger the visualization of 3D floor plans with layered situational data.

Figure 1 shows the floor plan from the HoloLens, using the existing 2D evacuation plans and room numbers in the building as markers to project the 3D floor plan. The 3D floor plan also shows the current and previously visited locations of the user, offering an enriched experience when navigating large-scale environments. Our proposed AR application was developed in Unity 3D for the Microsoft HoloLens and uses a fast and robust marker detection technique built on the Vuforia AR library. The application offers users an enhanced evacuation experience with engaging visuals, helping occupants learn the evacuation paths they could use during an emergency that requires evacuation. Our aim is to enhance the evacuation process by ensuring that building patrons know all of the building exits and how to reach them, which would improve evacuation time and reduce the injuries and fatalities that occur during indoor crises such as fires and active shooter events. We have incorporated existing features in the building as markers for the HoloLens application to trigger the floor plan and the corresponding location of the person in the building. This work also describes the system architecture as well as the design and implementation of this AR application, which leverages the Microsoft HoloLens for building evacuation. Pilot studies conducted with the system show its partial success and demonstrate the effectiveness of the application for emergency evacuation. Our results also indicate that the majority of participants felt that the HoloLens application can be used as a substitute for the 2D evacuation plans in a building. Evacuation plans are usually displayed as 2D plans in buildings, and it can be difficult for users to visualize the building from a 2D plan. The AR application gives users the flexibility and ability to visualize the building and its exits in 3D space.

Fig. 1. View of the floor plan from the HoloLens, using existing 2D plans and room numbers as markers.

Emergency response and decision making are very important in fire safety studies as well as during active shooter events. Casualties can be reduced if occupants are evacuated as soon as an emergency begins. Occupants may fail to evacuate in time because the building structure is poorly laid out or because they make unreasonable choices due to unfamiliarity with the building. Evacuation training and drills conducted in real buildings have disadvantages such as high cost, limited response time, and the inability to simulate real emergencies. The rapid rise of AR technology has made it possible to overcome these disadvantages using AR methodologies [1,2,3] and VR methodologies [4, 5]. With VR, safety professionals and building occupants immerse themselves in a virtual building with virtual fire and smoke; Sharma et al. [6,7,8,9] have conducted virtual evacuation drills in such immersive VR environments. With AR, safety professionals and occupants get a 3D, holographic view of the building floor plans and a better perspective of the building, making it easier to find a way out during an evacuation. Our past results and pilot studies have indicated that the majority of participants felt that the HoloLens application [10, 11] and AR mobile applications [12, 13] can be used as substitutes for the 2D evacuation plans in a building.

The objective of this paper is to present the research and development behind a series of mixed reality (MR) 3D visualization systems for communicating emergency evacuation plans within multilevel spaces using the Microsoft HoloLens, tablets, and phones. The rest of the paper is organized as follows: Sect. 2 discusses previous work related to this paper; Sect. 3 describes the system architecture of the proposed application; Sect. 4 focuses on the implementation and deployment; Sect. 5 discusses the results of the pilot study; Sect. 6 presents the conclusions.

2 Related Work

The most common carriers of 3D perspective are headsets for virtual reality (VR), augmented reality (AR), and mixed reality (MR). VR, AR, and MR all enhance the user's sense of presence in an environment, but they differ from one another. VR immerses the user in a digital space and cuts off the physical world [14]. AR presents the user with an overlay of digital content over the physical world [15]. MR, on the other hand, adopts the advantages of both AR and VR by blending reality and virtuality, and it is able to transform the physical world into the virtual world in real time [16]. Unlike AR, MR allows the user to experience depth, spatial persistence, and perspective [17]. The MR head-mounted display Microsoft HoloLens has been used as an example of how to express and visualize 3D geographic information from a 3D perspective [18]. The HoloLens provides stereoscopic 3D display, gaze design, gesture design, spatial sound design, and spatial mapping [19]. Fruend et al. [20] have built an AR tool to train assembly line workers on new and complex assembly processes.

Mobile augmented reality (MAR) allows users to learn, ubiquitously and effectively, the same concepts that can be taught in the classroom [20]. Safe and timely evacuation is especially important in indoor settings, which has motivated several mobile augmented reality applications (MARAs) built with the Android Software Development Kit. Meda et al. [21] have built a MARA that translates English text into another language (Telugu); in this Android-based application, the user takes a picture of English text and saves it as a JPG image file. Parhizkar et al. [22] have designed and developed an Android-based MARA to teach students general science. Sabri et al. [23] have described and evaluated a MARA [24] that teaches users about cultural heritage. Iguchi et al. [23] have implemented a MARA that features virtual children and trains adult users on how to direct children during a real-life evacuation.

3 System Architecture

The objective of this research is to demonstrate MR interfaces that provide spatially contextualized 3D visualization, promote knowledge acquisition, and support cognitive mapping within the realm of emergency management. This research builds upon the authors' previous emergency evacuation work [6,7,8,9] exploring game-engine-based evacuation and the HoloLens [10, 11] for evacuation in real and virtual spaces. Figure 2 shows the system architecture diagram of our application.

Fig. 2. System architecture diagram.

3.1 3D Assets

The AR visualizations presented in this paper focus on the visualization of 3D building models, text, and annotations created using SketchUp, Maya, and 3ds Max. The initial step was to create a 3D model of each floor plan of the building. SketchUp was used first to build the 3D assets such as walls, windows, and furniture. Both the 3D building and the evacuation pathways were exported as 3D object files (.obj). The 3D model of the campus was built using 3ds Max and then exported to FBX format, which is accepted by virtually all 3D software packages and game engines. One of the main challenges of this project was identifying the user's location: during emergencies, GPS and Wi-Fi may fail or work inadequately. To overcome this challenge, we incorporated existing permanent objects (images) in the physical surroundings. For example, the existing 2D room number nameplates in the building were used both for location detection and for superimposing the 3D floor plan on top of them. Figure 3 shows the existing signboards with room numbers in the building.

Fig. 3. Existing signboards of different room numbers in the building (used as markers).

3.2 AR Development

The mobile augmented reality (MAR) application was developed using Unity 3D. The prototypes are examples of image-based (or marker-based) AR, which uses computer vision software to identify pre-defined images and then renders virtual objects according to the position and orientation of those images in real space. Development involved C# scripting, animation, 3D assets, image and texture creation, mapping, and user gaze. In this phase, the Unity project was created and all necessary assets and data files were loaded into it. The rest of the project was then developed in the Unity environment, for example by inserting buttons and assigning C# scripts to objects to produce the desired effects, as illustrated in the sketch below.
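As an illustration of the marker-based approach, the following is a minimal sketch of a script attached to a Vuforia image target, assuming the classic (pre-10) Vuforia-for-Unity API; the class and field names are ours for illustration, not the project's actual asset names. The script shows a 3D floor plan child object only while the marker is detected or tracked.

```csharp
using UnityEngine;
using Vuforia;

// Attached to an ImageTarget (e.g. a room-number sign or posted 2D evacuation
// plan). Shows the associated 3D floor plan while the marker is tracked.
public class FloorPlanTrigger : MonoBehaviour, ITrackableEventHandler
{
    public GameObject floorPlanModel;   // 3D floor plan parented to this target

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
        if (floorPlanModel != null)
            floorPlanModel.SetActive(false);   // hidden until the marker is seen
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        // Show or hide the 3D floor plan anchored to the marker.
        if (floorPlanModel != null)
            floorPlanModel.SetActive(found);
    }
}
```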

3.3 AR Mobile Deployment

The 3D visualizations were developed for Android mobile devices and can be reconfigured for Apple or Windows-based devices. The prototypes were tested on two Samsung Galaxy mobile phones (S9 and Note 9), which are typical of the compact mobile technology available today, as well as on the HoloLens and a Samsung tablet. Deploying from Unity to the HoloLens led to challenges such as connection issues and missing packets. To overcome these issues, we built the Unity project into a solution that can be opened in Visual Studio. We then connected the computer and the HoloLens with a USB cable, selected the HoloLens as the target device in Visual Studio, and deployed the AR application to the HoloLens. After the project is deployed successfully, the user can run the application by opening it on the HoloLens.

4 Implementation and Deployment

The aim of this section is to give a detailed view of the project and of how the HoloLens can be used for building evacuation and indoor navigation. The lifecycle of the project can be divided into four phases.

4.1 Phase 1

During the initial phase, the aim was to build a 3D model of the building's floor plans. Using a 2D floor plan as a reference, the 3D model was designed in SketchUp, and the furniture and other 3D assets were then added to the model. The 3D modeling of the floor plans was done using SketchUp and 3ds Max. The model was converted into FBX format, which Unity can import and which enables wide-ranging exchange of 3D data between different applications. This FBX file was imported into the Unity project.

4.2 Phase 2

In this phase, Vuforia was loaded into Unity. As discussed earlier, the HoloLens is not equipped with GPS, so localization was instead achieved through feature extraction using Vuforia. Permanent features such as room number signs and evacuation plan boards in the building were used as target images or markers; the markers are thus nothing more than existing 2D assets or images in the environment. All of these images (see Fig. 3) were collected and loaded into the Vuforia database. After the Vuforia assets were imported into the project, the Vuforia database was also imported into the Unity project by dragging the database file into the Assets folder. When the Unity project was created, the scene was populated with a light and a camera; the camera was later replaced with the AR camera. If the Vuforia assets are loaded properly, the AR camera can be found under the Vuforia options in Unity. This camera requires a license key, obtained from a Vuforia account, which activates the AR camera and gives it full access within Unity.

4.3 Phase 3

In this phase, C# scripts were written to find the user's location using the feature extraction method. As mentioned earlier, all of the 2D target images were collected and loaded into the Vuforia database. Using image-target recognition, the user's current location is identified and a 3D marker is placed in the 3D scene; this was done by assigning a C# script to the respective target image, and the process was repeated for every target image in the database. This allowed us to populate the application with the current location and the respective paths to the nearest exits on the floor plan. Arrows leading from the user's current location were placed on all floor plans by means of a C# script. Room numbers and descriptions of the rooms were added on top of the 3D floor plan; these room numbers can be toggled on and off through a button on the GUI. The room descriptions are embedded in the floor objects, so when the floor plan is loaded its details also pop up. Buttons were then created on the GUI for controlling the arrow displays and the floor plan descriptions. Fire and smoke were also added to the floor plan and can be toggled on or off; if the user wants to use the application purely for indoor navigation, fire and smoke can be toggled off. The color of the user's location cube changes from red to yellow as the user moves: when the user faces a target image, the cube at that location is red, and when the user moves on, the previous cube turns yellow and the cube at the new current location is shown in red. A sketch of this location-cube logic follows.
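The following is a hedged sketch of the location-cube behavior described above, written as a small Unity C# helper. The class and method names are assumptions for illustration; in practice this would be called from the image-target handler (such as the FloorPlanTrigger sketch in Sect. 3.2) when a marker is recognized.

```csharp
using UnityEngine;

// Keeps track of the cube marking the user's current location.
// Previously visited cubes are recolored yellow; the current one is red.
public class LocationTracker : MonoBehaviour
{
    private static Renderer currentCube;   // cube at the user's current location

    // Called when the marker for a room is recognized by the AR camera.
    public static void SetCurrentLocation(Renderer cubeAtMarker)
    {
        if (currentCube != null && currentCube != cubeAtMarker)
            currentCube.material.color = Color.yellow;   // previously visited location

        cubeAtMarker.material.color = Color.red;         // new current location
        currentCube = cubeAtMarker;
    }
}
```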

4.4 Phase 4

In this phase, the application was deployed to the HoloLens, tablets, and phones. During deployment, the Unity application was converted into a Visual Studio executable format, and the converted code was opened in Visual Studio. Next, the HoloLens was connected to the system for deployment. After the application is deployed successfully, a new application icon appears in the HoloLens menu; Fig. 4 shows the deployment steps. Tapping the new icon on the HoloLens launches the MARA application, which starts scanning the surroundings for target images. When a target image is in front of the camera, the application pulls up the respective floor plan and the current location cube. If the user selects the floor info button on the GUI, the application displays the room numbers and descriptions on the floor plan.

Fig. 4. The life cycle of the project.

5 Simulation and Results

The AR interface presented here is a custom-designed mobile augmented reality application (MARA) that augments the evacuation plan with additional 3D geospatial data. It was developed with Unity and the Vuforia AR SDK and was installed on the authors' smartphones (Galaxy Note 9 and S9) using the Android and Visual Studio development kits. A photograph of the evacuation plan was converted into an AR image target and used as a marker. AR content is displayed on the mobile device when the user points the device camera at the evacuation plan posted on the wall (Fig. 5). The user can also enable different layers for text and annotations. Figure 5 also shows the path to the nearest exit, indicated by green arrows.

Fig. 5. The 2D evacuation plans and room number signs posted within a building serve as markers for the MARA (Color figure online).

The posted 2D evacuation plans in the building (see Fig. 5) provide a quick overview of the evacuation plan and a limited amount of spatial information. These 2D floor plans are designed as a quick reference to help occupants who are not familiar with the building or are lost during an evacuation. Without spatial context, it is difficult for occupants to know where the exits are and which route to safety is most direct. Figure 5 shows the visualization of the 3D model of the building in the MARA. Through buttons on the graphical user interface (GUI), the user can rotate the model, adjust its scale, display the shortest path to the exit, show the current location, and enable text and annotations for room numbers. The added layers of information help the user form a better mental representation of the building and provide visual perspectives that elicit a cognitive connection between data and space. The MARA allows users to locate the exits and the evacuation paths to them in 3D, improving their ability to evacuate the building and find safety.
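The following is a minimal sketch of how such GUI controls could be wired up in Unity; the field and method names are assumptions for illustration, with each method intended to be attached to a UI Button or Slider event rather than being the project's actual code.

```csharp
using UnityEngine;

// GUI controller for the 3D floor plan: toggles the annotation, arrow, and
// hazard layers, and rotates or scales the model.
public class FloorPlanUIController : MonoBehaviour
{
    public GameObject roomLabels;        // text/annotation layer
    public GameObject evacuationArrows;  // green arrows toward the nearest exit
    public GameObject fireAndSmoke;      // optional hazard layer
    public Transform floorPlanModel;     // the 3D building model

    public void ToggleRoomLabels()   { roomLabels.SetActive(!roomLabels.activeSelf); }
    public void ToggleArrows()       { evacuationArrows.SetActive(!evacuationArrows.activeSelf); }
    public void ToggleFireAndSmoke() { fireAndSmoke.SetActive(!fireAndSmoke.activeSelf); }

    // Rotate the model around its vertical axis (e.g. driven by a button or drag).
    public void Rotate(float degrees)
    {
        floorPlanModel.Rotate(Vector3.up, degrees, Space.World);
    }

    // Uniformly scale the model (e.g. driven by a slider or pinch gesture).
    public void SetScale(float scale)
    {
        floorPlanModel.localScale = Vector3.one * scale;
    }
}
```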

A limited user study was conducted to evaluate the effectiveness of the proposed HoloLens, mobile, and tablet applications. The study illustrates their partial success and demonstrates the effectiveness of the application in emergency response for building evacuation. Responses were collected from 10 participants, 8 male and 2 female. The post-study questionnaire measured participants' perceptions of the motivation, usability, educational and training effectiveness, and appropriateness of the AR applications (HoloLens, mobile phone, and tablet).

Figure 6 shows that the majority of the users (60%) felt that the HoloLens was more suitable for evacuation than a tablet or mobile phone. The post-study questionnaire for the user study asked the following:

Fig. 6. Device suitability for the study.

  • Do you consider this system useful in unknown buildings with a complex structure?

  • Will viewing this HoloLens app help during a real-time evacuation?

  • Can it substitute for the evacuation plans (2D plans) in a building?

  • Can it be used for educational or training purposes in evacuation?

The HoloLens application received more positive answers regarding usability, as shown in Fig. 7. All of the participants (100%) felt that the system would be useful in unknown buildings with a complex structure, while 90% of the participants felt that viewing the HoloLens app would help during a real-time evacuation.

Fig. 7. Participant responses to the post-study questionnaire.

6 Conclusions

This paper presented the research and development of a series of prototype AR visualizations for emergency evacuation using the HoloLens, tablets, and mobile phones. This work highlights the ability of AR technology to represent complex multilevel spaces in an inherently 3D form. Our proposed AR applications give a visual representation of a building in 3D space, allowing people to see where the exits are, and provide the shortest path to those exits as well as directions to a safe zone from the user's current position. We have described the system architecture and implementation of the AR application, which leverages the Microsoft HoloLens for emergency response during a building evacuation. We have introduced a series of visualization layers showing text and annotations for each floor to enhance spatial perception and situational awareness of multilevel spaces, and we have demonstrated how AR tools such as the HoloLens can support improved emergency preparedness.

The existing posted 2D evacuation plan in a building indicates the exits in close proximity to the viewer, but it does not provide the spatial context that would allow the viewer to make an informed decision about the safety of those exits. When augmented with 3D models, the viewer is given that additional context, as shown in Fig. 8. The MARA provides this spatial context for multi-level buildings, so the user can form new spatial knowledge about their location in the multilevel space and its relation to safely evacuating the building in an emergency. AR technology has the potential to transform the way we interact with information in the built environment. The 2D maps posted in buildings are the traditional way of showing occupants where they are and where the exits are, but they do little to help users form a mental representation of the building. With AR technology, users can use the existing 2D maps as markers for an enriched view of the building in 3D space with layered spatial information. A user study was conducted on the AR applications for the HoloLens, tablets, and mobile phones, and the experimental results show the effectiveness of our AR applications for emergency evacuation. All of the participants felt that the system would be useful in unknown buildings with a complex structure, while 90% of the participants felt that viewing the HoloLens app would help them during a real-time evacuation.

Fig. 8. Contextualizing evacuation pathways through the use of arrows.