1 Introduction

The purpose of this study is to extract standard movement codes for specific emotional behavior patterns from a human behavior information database and to develop educational applications composed of the corresponding opposite behavior codes. A standard movement code is the basic unit of the behavior classification system used for motion analysis and recording. Such data capture cognitive, emotional, and non-categorical elements of movement as nonverbal behavior and are critical for understanding interactive information. A standard movement code is therefore a foundation for source technology enabling communication between humans and machines, and vice versa, in our digital environment. In the fourth industrial revolution, it is essential to construct a service platform environment based on the Internet of Things, artificial intelligence, and big data in order to improve quality of life.

In this study, we used the Labanotation software LabanWriter, which is commercialized in the field of motion research, to analyze the behavioral data. We coded common behavioral characteristics from the notation data and designed intervention education content by constructing opposite behavior codes. In addition, among the various behaviors of human beings, we focused on the behavior of a group with emotional behavioral disorders, accumulated their motion data, and extracted specific behavior movement codes from it.

2 Behavioral Data Collection

The content for the emotional behavior items of the AMPQ-II (Adolescent Mental Health and Problem Behavior Screening Questionnaire-II) (Jung et al. 2008) was developed and verified by Korean behavioral experts. The emotional behavior scenarios were constructed to induce basic movements such as hand clapping, arm movement, and walking and running on the spot (see Table 2).

For the main group, three middle and high school students with emotional and behavioral disabilities (mood disorder, depression, ADHD), who were receiving medication and adjunctive therapy, were recommended by their physician. For the control group, data from seven university students majoring in drama and film were recorded. We extracted behavioral data using motion capture and Labanotation recording, and extracted emotional behavioral characteristic movement codes through comparative analysis of the behavior data codes of the two groups.

To analyze the subjects' behavior, we recorded their bodily movement with Labanotation and motion capture. Fifty markers were attached to each subject's head and body, and three Kinect cameras and twelve OptiTrack cameras were installed. Three observers recorded the subjects' behavior at the experimental site, and after the experiment was completed, the same observers transcribed the Kinect video into LabanWriter (see Footnote 1). In this way, notation information was recorded with LabanWriter for both the patient group and the normal group.
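A compact configuration sketch, written in Python purely for illustration (the dictionary and its field names are ours, not part of any capture software), summarizes the recording setup described above:

# Illustrative summary of the capture setup described in the text;
# values mirror the paper, field names are our own.
CAPTURE_SETUP = {
    "body_markers": 50,        # markers attached to head and body
    "kinect_cameras": 3,       # RGB-D video later transcribed into LabanWriter
    "optitrack_cameras": 12,   # optical motion-capture cameras
    "observers": 3,            # live behavioral observation at the site
}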

As an example, we selected one subject from each of the two groups and compared their notations. Among the seven items in Table 2, we present one comparison and derive the movement code from the notation data for item 1-B, as shown in Table 1. In the AMPQ-II, problem 1-B relates to the "Learning and Internet" question: "Does not concentrate when concentration is needed and does not do anything else." It was reconstructed as interactive content: "make various shapes and clap when the stars are shining." The behavior notation induced by the video was based on the Kinect video taken during the pilot. Notation was recorded in 1 s increments, and the notation results are summarized in Table 1 for parallel comparison of the data. Labanotation data are read from the bottom of the left column upward and then to the right.
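To illustrate the parallel, per-second comparison summarized in Table 1, the following minimal Python sketch aligns two hypothetical notation sequences second by second. The NotationEntry structure and the action glosses ("clap", "rest", "half-clap") are our illustrative assumptions, not LabanWriter output.

from dataclasses import dataclass

@dataclass
class NotationEntry:
    second: int   # the 1 s increment in which the symbol was recorded
    action: str   # simplified gloss of the recorded Laban symbol, e.g. "clap", "rest"

def align_by_second(control, patient):
    """Pair both groups' entries second by second for side-by-side reading."""
    by_sec_c = {e.second: e.action for e in control}
    by_sec_p = {e.second: e.action for e in patient}
    for s in sorted(set(by_sec_c) | set(by_sec_p)):
        yield s, by_sec_c.get(s, "-"), by_sec_p.get(s, "-")

# Hypothetical 4-second excerpt of the 1-B clapping task
control = [NotationEntry(1, "rest"), NotationEntry(2, "clap"),
           NotationEntry(3, "clap"), NotationEntry(4, "rest")]
patient = [NotationEntry(1, "clap"), NotationEntry(2, "rest"),
           NotationEntry(4, "half-clap")]
for second, c, p in align_by_second(control, patient):
    print(second, c, p)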

Table 1. An example of encoding and decoding data comparison

3 Educational Application with Opposite Behavioral Code

As the notation recording shows, both the normal group and the patient group responded to the 1-B content with clapping, but there are also clear differences between the two groups. In Table 1(a), the normal group did not clap, whereas the disability group showed a wrong reaction. In (b), the normal group claps clearly and distinctly, whereas the patient group cannot clap properly and cannot judge when to clap. In (c), the normal group still claps clearly, while the patient group shows a hesitant pattern in which the gap between reactions narrows and the claps are not executed properly. In (d), the normal group shows more detailed and clear behavior, whereas the patient group produces three sections of behavioral response, but the small claps are fewer and more weakly executed. Therefore, the emotional behavior movement codes wrong reaction, mistiming, hesitation, and passivity were extracted from the behavior of the disability group through comparative analysis of the notation data of the normal group and the disability group.
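The codes themselves were derived by expert reading of the notation; purely as an illustration, a rule sketch such as the following (with our own hypothetical heuristics, continuing the aligned-sequence example above) shows how per-second entries could be tagged with the four codes:

def tag_movement_codes(aligned):
    """aligned: iterable of (second, control_action, patient_action) tuples."""
    aligned = list(aligned)
    codes = set()
    for _, control, patient in aligned:
        if control != "clap" and patient == "clap":
            codes.add("wrong reaction")   # clapping when no clap was cued
        if control == "clap" and patient in ("-", "rest"):
            codes.add("mistiming")        # missing or misplacing the cued clap
        if patient == "half-clap":
            codes.add("hesitation")       # incomplete, tentative execution
    executed = lambda a: a not in ("-", "rest")
    if sum(executed(p) for _, _, p in aligned) < sum(executed(c) for _, c, _ in aligned):
        codes.add("passive")              # fewer executed responses overall
    return codes

# e.g. tag_movement_codes(align_by_second(control, patient))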

In order to remediate the emotional behavior problems revealed in the notation, we designed action scenarios composed of the opposite codes. The opposite codes of wrong reaction, mistiming, hesitation, and passivity are proper reaction, on-timing, immediacy, and simple and sharp movement; the scenarios should consist of behaviors contrary to the content of the AMPQ-II questions while incorporating these characteristics. Based on these criteria, the scenario for each question was designed as follows (a minimal sketch of the code mapping follows the list).

  • put the correct shape or color in the basket

  • make shapes together

  • make symbol of elements with own body

  • exit the maze without stepping on the bottom line

  • matchmaking, brick mining

  • pausing and moving play

  • mirroring, dance with myself
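As noted above, the problem-code to opposite-code mapping used when composing the scenarios can be sketched as follows; the helper name opposite_codes_for is ours and purely illustrative.

# Pairs taken from the text: each extracted problem code and its opposite behavior code.
OPPOSITE_CODES = {
    "wrong reaction": "proper reaction",
    "mistiming": "on-timing",
    "hesitation": "immediate",
    "passive": "simple and sharp",
}

def opposite_codes_for(problem_codes):
    """Return the opposite behavior codes a scenario should target."""
    return {OPPOSITE_CODES[c] for c in problem_codes if c in OPPOSITE_CODES}

# e.g. opposite_codes_for({"mistiming", "passive"}) -> {"on-timing", "simple and sharp"}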

As shown in Table 2, the proposed application is an educational behavioral intervention program that is expected to produce an emotional behavior intervention effect when applied for more than 12 weeks.

Table 2. Behavior-based application from selected questions of AMPQ-II and educational application designed with movement codes

4 Future Study

Discussion of the intervention program is being further developed, and the next pilot will accumulate behavior data by expanding the total number of subjects and the population with disabilities. Furthermore, by analyzing the data with machine learning techniques, we aim to construct a behavior classification system and standard movement codes for children with emotional disabilities. We hope that this research on standard movement codes and emotional-behavior-based content will be extended to general ICT use. It is important to develop ICT systems that can be applied to everyday life (emotional-communication digital environments), engineering (motion-recognition-based health care), performing arts (mixed-reality performance culture), sports science (intelligent motion analysis systems), education (emotional-communication ubiquitous education), and design (customized environment design).
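As one possible direction for the planned machine learning step, not implemented in this study, a minimal sketch using scikit-learn could classify per-segment notation features into movement codes; the feature definitions and labels below are purely illustrative assumptions.

from sklearn.ensemble import RandomForestClassifier

# X: hypothetical per-segment feature vectors derived from the notation,
# e.g. [claps in segment, mean response delay (s), incomplete-gesture count]
X = [[4, 0.2, 0], [0, 1.5, 2], [1, 0.9, 1], [3, 0.3, 0]]
y = ["typical", "passive", "hesitation", "typical"]   # expert-assigned movement codes

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
print(clf.predict([[0, 1.2, 1]]))   # predicted code for an unseen segment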