The Digital Earth Viewer is a web application for the spatiotemporal contextualization and visualization of heterogeneous data sources. It was developed with the goal of enabling real-time exploration of geoscientific data sets across spatial and temporal scales. To this end, it can ingest a large variety of data types commonly found in the geosciences, and it provides a user interface for interactive visual analysis. At the same time, online and offline deployment, a cross-platform implementation, and a comprehensive graphical user interface make the Digital Earth Viewer particularly accessible to scientific users.
3.3.2 Visualization Concept
This infrastructure provides a framework in which new visualizations for heterogeneous data sets can be created with relative ease. Since no reduction in dimensionality is undertaken and the four-dimensional data (space and time) is displayed in a four-dimensional context, temporal and spatial distortions are largely avoided; this improves interpretability and supports the intuitive understanding and contextualization of information. The Digital Earth Viewer enables the user to visualize assorted geoscientific data in three spatial dimensions and across time. It projects the spatial coordinates latitude, longitude, and altitude onto a virtual globe and builds an ordered registry of temporal events. Both of these features can be accessed through user interface elements in real time.
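The projection of geographic coordinates onto a virtual globe can be sketched as follows. This is a minimal illustration in Rust (the language of the viewer's back-end) that assumes a simple spherical Earth; the function name and the use of a sphere rather than a reference ellipsoid are assumptions for illustration, not the viewer's actual implementation:

```rust
// Minimal sketch: projecting (latitude, longitude, altitude) onto a virtual
// globe as Cartesian coordinates. A simple sphere is assumed here for
// illustration; the actual viewer's projection may differ.
const EARTH_RADIUS_M: f64 = 6_371_000.0;

/// Convert geographic coordinates (degrees, metres) into a 3D point.
fn to_globe(lat_deg: f64, lon_deg: f64, alt_m: f64) -> [f64; 3] {
    let (lat, lon) = (lat_deg.to_radians(), lon_deg.to_radians());
    let r = EARTH_RADIUS_M + alt_m;
    [r * lat.cos() * lon.cos(), r * lat.cos() * lon.sin(), r * lat.sin()]
}

fn main() {
    // A point at the equator and prime meridian lands on the positive x-axis.
    println!("{:?}", to_globe(0.0, 0.0, 0.0));
}
```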
Different data sources can be added simultaneously to form individual layers, each with its own data basis and transformation pipeline. Global parameters, such as the position in time and space or the intrinsic scale of the visualization, are communicated to the user implicitly and explicitly in the graphical interface, while specialized parameters can be set through a menu. The transformations for each data source happen independently of one another and are composed into one final result, allowing multiple data sources to be blended.
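The layer concept can be sketched in Rust as follows. All names here are hypothetical, and the composition step is simplified to the standard "over" alpha blend; the viewer's actual pipeline is more elaborate:

```rust
// Minimal sketch (all names hypothetical) of the layer concept: each data
// source becomes an independent layer with its own transformation, and the
// per-position results are alpha-blended into one final colour.
#[derive(Clone, Copy, Debug)]
struct Rgba { r: f32, g: f32, b: f32, a: f32 }

trait Layer {
    /// This layer's colour contribution at a geographic position.
    fn shade(&self, lat: f64, lon: f64) -> Rgba;
}

// Stand-in for a real data-driven layer: shades everything in one colour.
struct SolidLayer(Rgba);
impl Layer for SolidLayer {
    fn shade(&self, _lat: f64, _lon: f64) -> Rgba { self.0 }
}

/// Compose layers back to front with the standard "over" blend operator.
fn compose(layers: &[Box<dyn Layer>], lat: f64, lon: f64) -> Rgba {
    let mut out = Rgba { r: 0.0, g: 0.0, b: 0.0, a: 1.0 };
    for layer in layers {
        let c = layer.shade(lat, lon);
        out.r = c.r * c.a + out.r * (1.0 - c.a);
        out.g = c.g * c.a + out.g * (1.0 - c.a);
        out.b = c.b * c.a + out.b * (1.0 - c.a);
    }
    out
}

fn main() {
    let layers: Vec<Box<dyn Layer>> = vec![
        Box::new(SolidLayer(Rgba { r: 1.0, g: 0.0, b: 0.0, a: 1.0 })),
        Box::new(SolidLayer(Rgba { r: 0.0, g: 0.0, b: 1.0, a: 0.5 })),
    ];
    // A half-transparent blue layer over an opaque red one blends to purple.
    println!("{:?}", compose(&layers, 54.0, 7.0));
}
```

Because each layer only implements its own `shade`, the transformations stay independent of one another, and blending happens solely in the final composition step.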
Data is grouped into several different categories for display. Traditional 2D maps can be projected onto a spherical surface and displacement along the sphere’s normal vector can be applied. Scalar values are mapped onto one of a set of color maps, while precolored maps are passed through. Sparse data can be displayed as a point cloud which is projected, colored, and culled according to the global projection parameters. For the intuitive representation of vector fields, an animated particle system is created in which the particles follow the vector field which is projected onto the virtual globe.
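The scalar-to-color mapping described above can be illustrated with a minimal sketch. Real color maps use many stops, and the function below (a hypothetical two-stop linear interpolation) is an illustration of the principle rather than the viewer's actual color pipeline:

```rust
// Minimal sketch of mapping a scalar value onto a colour map: the value is
// normalised into [0, 1] and linearly interpolated between two colour stops.
fn map_scalar(value: f64, min: f64, max: f64, low: [u8; 3], high: [u8; 3]) -> [u8; 3] {
    // Out-of-range values are clamped to the ends of the colour map.
    let t = ((value - min) / (max - min)).clamp(0.0, 1.0);
    let mut rgb = [0u8; 3];
    for i in 0..3 {
        rgb[i] = (low[i] as f64 + t * (high[i] as f64 - low[i] as f64)).round() as u8;
    }
    rgb
}

fn main() {
    let (blue, red) = ([0, 0, 255], [255, 0, 0]);
    // A mid-range value falls halfway between the two stops.
    println!("{:?}", map_scalar(5.0, 0.0, 10.0, blue, red));
}
```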
3.3.3 Technical Implementation
The tool is a hybrid application that is split into a server back-end and a client front-end. The Rust (https://www.rust-lang.org/) back-end handles data extraction from different file formats as well as re-gridding into viewable areas and caching. It can be hosted remotely on a server or locally for offline access. The front-end consists of an HTML interface component and is responsible for the 3D data rendering, which uses the WebGL API (https://www.khronos.org/webgl/), and for the graphical user interface controls, which are implemented with Vue.js (https://vuejs.org/).
For each data type that the server needs to ingest, a dedicated internal type is built that transforms the incoming data format into an internal representation optimized for computational operations. This representation is then passed to the client, which applies graphical transformations to compute a visual representation of it.
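This per-format ingestion idea can be sketched as follows. The trait, struct names, and the internal grid representation are all illustrative assumptions, not the viewer's actual API; a toy CSV reader stands in for a real format handler:

```rust
// Hypothetical sketch of per-format ingestion: each supported format
// implements one trait that converts raw bytes into a shared internal
// representation (here, a flat grid of f32 samples).
struct Grid {
    width: usize,
    height: usize,
    values: Vec<f32>,
}

trait DataSource {
    /// Parse raw bytes in this source's format into the internal grid.
    fn ingest(&self, raw: &[u8]) -> Result<Grid, String>;
}

// Example: a toy CSV reader standing in for a real format handler.
struct CsvSource;
impl DataSource for CsvSource {
    fn ingest(&self, raw: &[u8]) -> Result<Grid, String> {
        let text = std::str::from_utf8(raw).map_err(|e| e.to_string())?;
        let rows: Vec<Vec<f32>> = text
            .lines()
            .map(|line| {
                line.split(',')
                    .map(|v| v.trim().parse().unwrap_or(f32::NAN))
                    .collect()
            })
            .collect();
        let height = rows.len();
        let width = rows.first().map_or(0, |r| r.len());
        Ok(Grid { width, height, values: rows.into_iter().flatten().collect() })
    }
}

fn main() {
    let grid = CsvSource.ingest(b"1,2\n3,4").unwrap();
    println!("{}x{}: {:?}", grid.width, grid.height, grid.values);
}
```

Adding support for a new file format then amounts to implementing the trait once; the rest of the pipeline only ever sees the internal representation.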
A running instance of the application can be accessed at the following web address: https://digitalearthviewer.geomar.de.
The Digital Earth Viewer is an open-source software licensed under the EUPL (https://joinup.ec.europa.eu/collection/eupl/eupl-text-eupl-12).
3.3.4 Methane Budget in the North Sea
The Digital Earth showcase “Methane Budget in the North Sea” was set up to build a methane gas budget for the North Sea region. The showcase makes use of the Digital Earth Viewer to unify a large number of data sets under a single visualization interface. Boreholes from fossil fuel production are known to be important sources of methane that is released into the atmosphere. The GEBCO (https://www.gebco.net) bathymetry of the North Sea region is displayed as a 3D elevation model. The aerosol and trace gas dispersion model ICON-ART (Rieger et al. 2015) is used to calculate the contribution of these boreholes to the atmospheric methane concentration. The viewer’s interface allows users to quickly compare the resulting data product with existing methane estimates from the EDGAR (https://data.jrc.ec.europa.eu/collection/edgar) emissions database and provides a visual assessment of their accuracy. In a similar way, measurements of geochemical water properties from the expedition POS 526 (https://oceanrep.geomar.de/47036/) are displayed in the spatial context of other measurement compilations from the PANGAEA (https://www.pangaea.de) and MEMENTO (Bange and Bell 2009) databases. The observation of their development over time is further supported by the visualization of three-dimensional physical water properties such as current velocities and pycnocline depth obtained from the NEMO model (Madec 2016). An instance of the Digital Earth Viewer displaying the methane showcase can be accessed at the following web address: https://digitalearthviewer-methane.geomar.de.
Added value: Using the Digital Earth Viewer, scientists can simultaneously access and visualize data from all the sources mentioned above. Seamless spatial navigation allows them to directly compare the global impact that regional methane sources have on the atmosphere, while temporal components enable them to do so across the different seasons of an entire year (Fig. 3.2).
3.3.5 Explorable 4D Visualization of Marine Data
The expedition Mining Impact-II started a series of experiments to answer some of the most important questions regarding profitability and sustainable exploitation of resources in deep sea mining, such as mining for manganese nodules. Deep sea exploration is a challenging endeavor that can be greatly aided by the use of modern visualization techniques. In this work, we aim to recreate a sediment plume which resulted from an underwater dredging experiment. This will help to quantify similar sediment depositions from mining that could impact deep sea ecosystems at a depth of 4,000 m. Sensor data fusion allows for the virtual exploration of experiments on the seafloor; the following is an overview of the different data sources acquired during the expedition that come together within one visualization:
- Turbidity sensors determine this optical property of the water by measuring the scattered light that results from illuminating it.
- Current sensors use the Doppler effect to measure the velocity of particles suspended in the water column and thus calculate the speed and direction of water currents.
- Multi-beam echosounders emit fan-shaped sound waves and use time of flight to reconstruct the seafloor bathymetry.
For the dredging experiment, an array of eight turbidity sensors and eight current sensors was placed on the seafloor in an area previously scanned with a high-resolution multi-beam echosounder mounted beneath an autonomous underwater vehicle. Moreover, a virtual sediment plume was modeled and integrated into the experiment. All data sources were contextualized in space and time, allowing for real-time simultaneous analysis of heterogeneous data sources in 3D and across time. An instance of the Digital Earth Viewer displaying the sediment plume showcase can be accessed at the following web address: https://digitalearthviewer-plume.geomar.de.
Added value: Using the Digital Earth Viewer, the experiment grounds were recreated. This virtual environment allowed scientists to peer beyond the darkness of the deep sea and explore the impact of a simulated mining endeavor. The numerical model of the sediment plume was superimposed on and compared to the in situ data obtained by the turbidity and current sensors, resulting in a visual confirmation of the sensor placement and the model's correctness. Deployment as a web-based application provided cross-platform portability across devices and operating systems, allowing scientists to visually explore the phenomena that take place in this virtual abyssal plain and to share their discoveries with a wider audience (Fig. 3.3).