Guest editorial: special issue on big data for effective disaster management (In Memoriam of Tao Li)
It is well known that hurricanes, earthquakes, and other natural disasters cause immense physical destruction and loss of life and property around the world. Unfortunately, the frequency and intensity of natural disasters have increased significantly in recent decades, and this trend is expected to continue. In the face of these sudden and unpredictable disasters, disaster management has become a major challenge for governments across the world. Recently, however, data sources such as mobile phone records, GPS trajectories, location-based online social networking data, surveillance video, satellite imagery, and IC card data have become readily available, and this information has grown explosively. This explosion of sensing data constitutes "Big Data" and offers a new way to circumvent the methodological limitations of earlier research, enabling more effective disaster management. As such, big data for more effective disaster management is spurring a tremendous amount of research and development of related technologies and applications.
The goal of this special issue is to provide a premier forum for researchers working on big data for disaster management to present their recent research results. It also provides an important opportunity for multidisciplinary studies connecting data mining and big data analytics to disaster management.
Following an open call for papers, we received a total of 10 submissions for this Special Issue, spanning all topics in big data for effective disaster management. After an initial screening, all the submitted manuscripts were put forward for review. Each manuscript was reviewed by at least three selected experts in the respective area, based on relevance, novelty, significance, technical quality, and clarity. Following this competitive, two-round review process, we selected three papers for final publication. These articles cover a range of important topics related to big data for effective disaster management.
The first paper, “Multimodal Deep Learning based on Multiple Correspondence Analysis for Disaster Management”, by Samira Pouyanfar, Yudong Tao, Haiman Tian, Shu-Ching Chen, and Mei-Ling Shyu, proposes a multimedia big data framework based on advanced deep learning techniques. The study targets content analysis and mining for disaster management and collects a dataset of natural disaster videos from YouTube. Two separate deep networks, a temporal audio model and a spatio-temporal visual model, are then presented to analyze the audio and visual modalities in the video clips. Finally, the paper designs a fusion model based on the Multiple Correspondence Analysis (MCA) algorithm to integrate the audio and visual models. The experimental results show the effectiveness of both the visual model and the fusion model, which achieve high multi-class classification accuracy on this challenging dataset.
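To give a flavor of the general idea, the following minimal sketch shows generic weighted late fusion of two per-class score vectors. It is an illustration only, not the paper's MCA-based fusion algorithm; the class labels, score values, and fixed weights are all assumptions made for the example.

```python
# Illustrative late-fusion sketch (NOT the paper's MCA algorithm):
# combine per-class scores from a hypothetical audio model and a
# hypothetical visual model with assumed fixed weights, then pick the
# highest-scoring disaster class.

DISASTER_CLASSES = ["flood", "hurricane", "wildfire"]  # hypothetical labels

def fuse(audio_scores, visual_scores, w_audio=0.4, w_visual=0.6):
    """Weighted late fusion of two per-class score vectors."""
    fused = [w_audio * a + w_visual * v
             for a, v in zip(audio_scores, visual_scores)]
    best = max(range(len(fused)), key=fused.__getitem__)
    return DISASTER_CLASSES[best], fused

# Assumed example scores: both modalities favor the second class.
label, scores = fuse([0.2, 0.7, 0.1], [0.1, 0.8, 0.1])
print(label)  # prints "hurricane"
```

In the paper itself, the MCA step learns how informative each modality is per class rather than using fixed weights as above.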
The second paper, “dTexSL: A Dynamic Disaster Textual Storyline Generating Framework”, by Ruifeng Yuan, Qifeng Zhou, and Wubai Zhou, proposes a dynamic disaster storyline generation framework. In its high-level layer, the framework generates a global storyline describing the evolution of the disaster events; in its local-level layer, it provides condensed information about specific regions affected by the disaster. Experimental results on typhoon datasets demonstrate the effectiveness of the overall framework at each level.
The last paper, “Machine Learning Based Fast Multi-Layer Liquefaction Disaster Assessment”, by Chongke Bi, Bairan Fu, Jian Chen, Yudong Zhao, Lu Yang, Yulin Duan, and Yun Shi, proposes a machine-learning-based multi-layer approach for fast and reliable assessment of liquefaction disasters. First, a simple convolutional neural network (CNN) model is employed to identify the most dangerous (liquefaction) areas. In parallel, the fast Fourier transform (FFT) is used to transform the surface ground motion (SGM) data from the time domain to the frequency domain. The Light Gradient Boosting Machine (LightGBM) is then used to locate the dangerous (liquefaction) areas with improved precision. With the proposed approach, the assessment result can be produced with high efficiency (in a few seconds or less) to support emergency evacuation during an earthquake.
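The time-to-frequency step in this pipeline can be sketched as follows. This is a naive discrete Fourier transform using only the standard library (a real system would use an optimized FFT implementation), and the synthetic "SGM" trace is an assumption for illustration, not data from the paper.

```python
# Sketch of the time-to-frequency transform described above: a naive
# DFT computing the magnitude spectrum of a real-valued time series.
# The synthetic surface-ground-motion signal is invented for this example.
import cmath
import math

def dft_magnitudes(signal):
    """Magnitude spectrum of a real-valued time series via a naive DFT."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(signal)))
            for k in range(n)]

# Synthetic SGM trace: a sinusoid with 5 cycles over a 64-sample window.
sgm = [math.sin(2 * math.pi * 5 * t / 64) for t in range(64)]
spectrum = dft_magnitudes(sgm)

# The dominant bin (searching the positive-frequency half) recovers the
# 5-cycle component, i.e. the kind of frequency-domain feature LightGBM
# would consume downstream.
peak = max(range(1, 32), key=spectrum.__getitem__)
print(peak)  # prints 5
```

Frequency-domain features like these magnitudes are far more compact and discriminative inputs for a gradient-boosting classifier than the raw time-domain motion samples.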
The guest editorial team of this Special Issue would like to thank all of the authors for submitting their fine work to this Special Issue. Thanks to the hard work of the reviewers who provided their expert reviews under very tight schedules, the quality of the final papers presented in this Special Issue has been greatly improved. Finally, we would like to thank Prof. Marek Rusinkiewicz and Prof. Yanchun Zhang, the Editor-in-Chief, for approving the proposal of this Special Issue and for the tremendous support and guidance they have provided throughout the process.
Memorial of Dr. Tao Li: As the initial Lead Guest Editor of this Special Issue, Dr. Tao Li, a talented Professor of Computer Science at Florida International University, suffered a stroke during a doctoral defense on December 6, 2017. He passed away on December 13, 2017, at the age of 42. During his short but productive life, Professor Li became an internationally renowned expert in data mining and machine learning, receiving numerous national and international honors, including the NSF CAREER Award, the Kauffman Professor Award, the IBM Scalable Data Analytics Innovation Award, and multiple mentorship and service awards at FIU. He also supervised 16 doctoral students to the completion of their doctoral programs, many of whom are now active researchers in their respective fields. His excellence in research, his passion for teaching and student supervision, and his great leadership and personality will always be missed and remembered.