Introduction

Advances in digital devices such as computers and smartphones have led to the widespread use of input–output devices such as keyboards, pointing devices, and touchscreens. These input–output devices offer rapid response times, convenience, and ease of use [1]. In particular, the use of touch-based input–output devices has surged rapidly because of their simple and intuitive operation through finger touches and hand gestures. The onset of the era of social distancing during the COVID-19 pandemic further accelerated the utilization of various digital devices [2, 3], increasing children's exposure to a multitude of digital devices from their early years of development [4, 5]. However, infants and young children who cannot yet read have not developed the visual acuity required to perceive rapid movements on a screen [6]. Such adult-centric input–output devices are therefore physically unsuitable for early childhood, a critical period for language cognition, visual perception, and sensory development. Moreover, digital devices operated by simple hand gestures are designed primarily for adult convenience, making them unsuitable for early childhood, a phase in which the gross and fine motor skills essential for tasks such as object manipulation develop rapidly. Digital input–output devices tailored for adults can thus negatively affect the physical development of young children [7]. They may also adversely affect neural development, because repeated exposure to digital devices blurs the distinction between the real and virtual worlds for young children. Psychological assessments of young children are therefore necessary, yet the available methods remain adult-centric. Conventional methods for assessing mental health involve expert consultations, surveys, and brain imaging within mouse- and keyboard-based environments [8]. Unfortunately, these methods are designed for adults and require a minimum level of physical and intellectual capability; they are also prone to subjectivity and bias, making it challenging to accurately diagnose mental health conditions in children without specialized expertise [9]. Therefore, the development of an input–output system focused on young children is imperative.

Because of limitations in verbal communication, young children express their emotions and intentions through actions such as shaking or throwing objects, pressing objects, and mouthing behaviors. In the case of neurodevelopmental disorders such as autism, diagnosis can be predicated upon the child's behavior: seemingly purposeless rotation of objects or repetitive actions may appear trivial in infants but are indicative of early autism symptoms in toddlers [10]. A method is therefore needed to understand the intent behind such expressions or behavioral patterns that may indicate early signs of a disorder. Contemporary techniques for analyzing children's behavioral patterns rely on visual information, using technologies such as segmentation, object detection, and classification to capture position and movement patterns [11]. Nevertheless, these conventional methods depend on videos recorded from a third-party perspective, limiting them to discerning only the gross behaviors of young children; information about subtle actions and the patterns within those actions remains elusive. Consequently, capturing fine-grained behaviors requires sensors capable of detecting various behaviors and a platform that can serve as a child-friendly intermediary in the form of a familiar toy. Such a platform can integrate task-based educational programs and collect data on the behavior, emotions, and intentions of young children, potentially facilitating early diagnosis of physical, neurological, and mental health disorders.

With the advancement of sensor technology, our research aims to integrate lightweight and compact physical sensors into digital interfaces accessible to young children. The goal is to develop a sensor-based user interface (SUI) capable of collecting and analyzing data on specific behaviors of young children whose behavior is difficult to assess through direct communication. In the proposed SUI, we integrate commercial accelerometer/gyroscope sensors into toy blocks to detect physical movements and include pressure and humidity sensors to detect fine-grained behaviors, such as finger pressing or trembling associated with emotional states. This may facilitate the understanding of young children's behaviors and, in turn, the estimation of their emotions and intentions. The data from these sensors can potentially be processed in real time using artificial intelligence (AI) algorithms. The platform may also enable the development of educational content focusing on cognitive and play development through specific tasks. By integrating input–output devices, we propose a comprehensive system for analyzing young children's cognitive and play behaviors. The proposed SUI is expected to serve as an indirect communication tool benefiting young children as well as those with communication challenges, including the hearing-impaired, elderly, and disabled.

Results and discussion

Sensor-based user interface (SUI) configuration

The sensor-based user interface (SUI) comprises an inertial measurement unit (IMU), pressure sensors, and humidity sensors to measure both fine-grained and global behavior. The platform takes the form of a cuboidal block, with the IMU and circuitry embedded within the block. Multilayered pressure and humidity sensor arrays are installed on the external surface of the block. In this configuration, the humidity sensors, which operate by direct contact with external water molecules, are positioned on the outermost layer of the block, and the pressure sensors are installed beneath them (Fig. 1). To detect actions involving pressing or touching with the hand or mouth, the pressure and humidity sensors were fabricated using a printed circuit board (PCB) printer (V-One, Voltera). For global motion detection, an IMU was used without a separate flexible sensor, because global motion does not involve direct contact such as mouthing or pressing. In this study, a commercial Arduino Nano BLE (Bluetooth Low Energy) board (Nano 33 BLE, Arduino) with a built-in accelerometer and gyroscope was employed to capture the global motion and impacts of objects.

Fig. 1

Sensor-based user interface (SUI) for behavioral analysis. Input behavior parameters and the design framework of the sensor-integrated platform

Pressure and humidity sensors and sensor array

Pressure sensors can be broadly categorized into two types based on their detection mechanisms: capacitive and piezoresistive. In this study, we opted for the piezoresistive method because of its advantages, including high sensitivity at low pressures and ease of fabrication [12]. Piezoresistive pressure sensors convert changes in external pressure into electrical signals by varying the resistance of a pressure-sensitive layer within the sensor [13]. Commercial Velostat (Velostat 1361, Adafruit) was used as the pressure-sensitive layer. Velostat is a pressure-sensitive material that gains electrical conductivity from carbon black incorporated into a polymer layer [14, 15]; it is lightweight, flexible, and cost-effective, making it suitable for a wide range of applications. Pressure intensity is measured by exploiting the principle that the distance between conductive particles within the Velostat film decreases under applied pressure or bending, increasing the number of conductive pathways and thus reducing the film's resistance [13]. A single pressure sensor has a sandwich structure consisting of two conductive electrodes printed on a polyimide (PI) film using the PCB printer, with the pressure-sensitive Velostat film positioned in between [16]. To evaluate the fabricated single pressure sensor, we applied force to it and measured its resistance using both a source meter (2400 Standard, Keithley) and an Arduino. At the maximum applied force, the source meter measured a resistance of approximately 250 Ω, prompting us to choose a reference resistance of 10 kΩ. When no force is applied, the sensor resistance approaches infinity and the Arduino reads approximately 0 V; as force is applied, the sensor resistance decreases and the reading approaches the 5 V supply voltage. Using the voltage-divider relation, we reverse-calculated the sensor's resistance from this voltage, converted it to the input pressure, and output the pressure data via the serial monitor. By sequentially applying pressures ranging from 7 to 55 kPa to the single pressure sensor, we constructed a graph of the average resistance values output by the sensor for each applied load. The graph in Fig. 2a shows that the fabricated sensor exhibits an exponentially decreasing resistance with increasing pressure.
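
As an illustration of this readout, a minimal host-side sketch is given below, assuming the Velostat sensor and the 10 kΩ reference resistor form a voltage divider across the 5 V supply and that the Arduino reports its raw 10-bit ADC reading; the function name and the serial transfer of the ADC value are hypothetical, not part of the original implementation.

```python
# Minimal sketch of the voltage-divider readout described above.
# Assumption: the Velostat sensor is in series with a 10 kOhm reference
# resistor across the 5 V supply, and the ADC measures the voltage across
# the reference resistor (~0 V with no force, ~5 V at maximum force).
V_SUPPLY = 5.0        # supply voltage (V)
R_REF = 10_000.0      # reference resistance (ohm)
ADC_MAX = 1023        # 10-bit Arduino ADC full scale

def sensor_resistance(adc_value: int) -> float:
    """Reverse-calculate the sensor resistance from a raw ADC reading."""
    v_out = V_SUPPLY * adc_value / ADC_MAX
    if v_out <= 0.0:
        return float("inf")            # no force applied: effectively open circuit
    return R_REF * (V_SUPPLY - v_out) / v_out

# Example: an ADC reading of 512 (~2.5 V) corresponds to roughly 10 kOhm,
# i.e. the sensor resistance equals the reference resistance.
print(sensor_resistance(512))
```

The input pressure then follows from the resistance–pressure calibration curve of Fig. 2a.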

Fig. 2

Single flexible resistive pressure sensor and capacitive humidity sensor. a Single resistive pressure sensor and pressure–resistance correlation; b single capacitive humidity sensor and humidity–capacitance correlation

Humidity sensors can be categorized into two types based on their measurement methods: capacitive and resistive. In the capacitive method, tiny water droplets adhering to a substrate change the dielectric constant and, consequently, the capacitance [17, 18]; humidity is measured by detecting this increase or decrease in capacitance. The advantages of this approach include high linearity, low sensitivity to external temperature, and the ability to measure low humidity [19, 20]. In this study, we adopted the capacitive method to fabricate the humidity sensor. The conventional parallel-plate capacitor design [21] is unsuitable for humidity measurement because the top and bottom are enclosed by plates and dielectric material, leaving no space for water droplets to permeate. Therefore, we designed a humidity sensor with an interdigitated structure, as shown in Fig. 2b. Furthermore, unlike the traditional interdigitated electrode (IDE) structure, we printed the capacitor electrodes on both sides of the film to facilitate sensor integration. Although this design approach has the drawback of an additional series capacitance, which decreases the initial capacitance, we adopted it because of its advantages for integration and sensor-array expansion. We then characterized the fabricated single humidity sensor using both an LCR meter and an Arduino. To evaluate the functionality of the completed sensor, we used a constant temperature and humidity chamber (TH3-ME, JEIO Tech) to gradually increase the humidity over approximately 15 min. We measured the changes in capacitance and verified that multiple sensors manufactured under the same conditions exhibited similar ranges of capacitance values. These results served as the basis for the graph depicted in Fig. 2b, which shows that the capacitance of the fabricated sensors increases with rising humidity, confirming correct functioning of the humidity sensor.
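
As an illustration of how such capacitance readings could be mapped back to relative humidity, a minimal host-side calibration sketch is shown below; the calibration points and the assumed linear relation are placeholders for illustration only, not values measured in this work.

```python
# Minimal sketch: fit a linear humidity-capacitance relation from chamber
# calibration points and invert it at run time. The numbers below are
# hypothetical placeholders, not data from this study.
import numpy as np

rh_cal = np.array([20.0, 40.0, 60.0, 80.0])    # chamber relative humidity (%RH)
cap_cal = np.array([10.0, 12.5, 15.5, 19.0])   # measured capacitance (pF), assumed

# Least-squares fit of C = a * RH + b
a, b = np.polyfit(rh_cal, cap_cal, 1)

def humidity_from_capacitance(c_pf: float) -> float:
    """Estimate %RH by inverting the fitted calibration line."""
    return (c_pf - b) / a

print(round(humidity_from_capacitance(14.0), 1))   # example reading
```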

To increase the measurement area of the designed single pressure and humidity sensors, we expanded them into 5 × 5 and 5 × 10 sensor array structures. For the pressure sensor array, we reduced noise from parasitic capacitance by printing the top-side lines horizontally and the bottom-side lines vertically, avoiding interference between them. For the humidity sensor array, we printed electrodes on both sides of the PI film to promote integration and minimize weight; to reduce interference noise between electrode lines, the top-side lines were printed vertically and the bottom-side lines horizontally. To simultaneously collect signals from the 25 and 50 sensors in the two arrays, we used a multiplexer (MUX, CD74HC4067, Texas Instruments) and connected flexible flat cables (FFC, AWM 20624, Fuxell) to each side accordingly. A Python heatmap was used to visualize the received signal data, as depicted in Fig. 3, allowing us to observe the activity of individual regions of the arrays, after which the overall arrays were evaluated. In Fig. 3a, the regions where the resistance decreased upon application of force to specific areas of the pressure sensor array are highlighted in blue, clearly indicating where the force was exerted. Similarly, for the humidity sensor array in Fig. 3b, placing a wet tissue on a portion of the array produced a red region in the heatmap, demonstrating that the fabricated pressure and humidity sensor arrays function effectively even when only specific sections are engaged.
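
A minimal host-side visualization sketch is given below, assuming the Arduino streams one comma-separated frame of 25 MUX-scanned readings per line over a serial port; the port name, baud rate, and frame format are assumptions for illustration, not the exact acquisition code used in this work.

```python
# Minimal sketch: read one frame of MUX-scanned array data over serial and
# display it as a heatmap. Port name and frame format are assumptions.
import numpy as np
import serial                     # pyserial
import seaborn as sns
import matplotlib.pyplot as plt

PORT = "/dev/ttyUSB0"             # hypothetical serial port
ROWS, COLS = 5, 5                 # 5 x 5 pressure sensor array

with serial.Serial(PORT, 115200, timeout=1) as ser:
    line = ser.readline().decode().strip()                  # e.g. "512,498,...,505"
    frame = np.array([int(v) for v in line.split(",")], dtype=float)

sns.heatmap(frame.reshape(ROWS, COLS), cmap="coolwarm", cbar=True)
plt.title("Pressure sensor array (raw ADC counts)")
plt.show()
```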

Fig. 3

Flexible pressure and humidity sensor arrays. a Fabricated pressure sensor array and performance evaluation in the absence of pressure (left) versus with pressure applied to a specific area (right); b fabricated humidity sensor array in the absence of humidity (left) versus with a wet tissue placed on a specific area (right)

System integration for behavioral pattern detection

Figure 4a shows the system integration, with the pressure and humidity sensor arrays on the external surface of the block and the IMU inside the block. The BLE board with the IMU is installed inside the block and provides rotation-angle data (yaw, pitch, and roll) in real time as the block moves. The pressure sensor array is attached to the block's surface, and the humidity sensor array is installed on top of it. The MUX and FFCs connected to the pressure and humidity sensor arrays are housed inside the block, and real-time data are acquired through the Arduino. Using the integrated block, a Python heatmap and a Processing program are used to verify that the sensors operate appropriately in specific environments. In Fig. 4b, a wet tissue is placed on specific areas of the sensor and pressure is applied on top of it, visually confirming the simultaneous measurement of humidity and pressure in both the affected and unaffected regions. After sensor integration, when pressure is applied without a wet tissue, the pressure response is clearly confirmed while the humidity response is weak; conversely, when a wet tissue is used without applying pressure, the pressure heatmap does not respond and only humidity data are output. The yaw, pitch, and roll angles, calculated from the accelerometer and gyroscope data output by the BLE board using the Madgwick filter, are received by the Processing program, which visualizes the board's movement in 3D. Based on the degree to which the BLE-equipped block is tilted, the program renders the block's 3D motion from the three-axis yaw, pitch, and roll data, as depicted in Fig. 4c.

As shown in Fig. 5a, b, behavioral actions were categorized into fine-grained behavior and global behavior, and visualization was carried out to analyze specific behavioral patterns using the integrated block. First, fine-grained behavior detection using the humidity and pressure sensors was divided into two cases: simply pressing with a finger and lightly trembling the finger. The heatmaps confirm that the humidity and pressure sensor arrays on the completed platform clearly distinguish between a pressed and an unpressed finger. Global behavior was tested using the IMU accelerometer data and divided into shaking and throwing actions. When the block is shaken, all three axes of the accelerometer change, and when the motion stops, a constant acceleration value is observed, demonstrating that significant changes in the accelerometer values occur only during the shaking action. When the block is thrown, the acceleration along the direction of the fall shows the most significant change, and repeated throwing actions result in nearly identical patterns of accelerometer data on all three axes. Using the sensor data from the completed platform, different values are expected for each specific action, allowing various behaviors to be distinguished.
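
As a sketch of the orientation step described above, the following host-side example uses the open-source ahrs Python package's Madgwick implementation to fuse one gyroscope/accelerometer sample into a running quaternion and convert it to yaw, pitch, and roll; the sample rate, units, and data transfer are assumptions, and this is one possible implementation rather than the exact code used in this work.

```python
# Minimal sketch: Madgwick sensor fusion and quaternion-to-Euler conversion
# on the host, using the open-source `ahrs` package. Units (rad/s, m/s^2) and
# the default 100 Hz sample rate are assumptions.
import numpy as np
from ahrs.filters import Madgwick

madgwick = Madgwick()                       # default sampling frequency: 100 Hz
q = np.array([1.0, 0.0, 0.0, 0.0])          # initial orientation quaternion (w, x, y, z)

def update_orientation(q, gyr_rad_s, acc_m_s2):
    """Fuse one IMU sample (3-axis gyro and accelerometer) into the quaternion."""
    return madgwick.updateIMU(q, gyr=gyr_rad_s, acc=acc_m_s2)

def quaternion_to_ypr(q):
    """Convert a unit quaternion (w, x, y, z) to yaw, pitch, roll in degrees."""
    w, x, y, z = q
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    return np.degrees([yaw, pitch, roll])

# Example: one stationary sample (no rotation, gravity along +z)
q = update_orientation(q, np.zeros(3), np.array([0.0, 0.0, 9.81]))
print(quaternion_to_ypr(q))
```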

Fig. 4

Flexible force and humidity sensor–IMU integration. a The comprehensive sensor platform and a cross-sectional view of the layered sensor structure. b Simultaneous detection of pressure and humidity by the integrated sensor. c Visualization of accelerometer and gyroscope measurements from the IMU

Fig. 5

Behavior pattern analysis. a Detection of fine-grained behavior with integrated data visualization. b Global behavior detection by IMU accelerometer data
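
As an illustration of how the global behaviors in Fig. 5b might be separated automatically, a minimal rule-based sketch is given below; the thresholds and decision rules are illustrative assumptions rather than values derived from the recorded data.

```python
# Minimal sketch: rule-based separation of shaking and throwing from one
# window of accelerometer samples. Thresholds are illustrative assumptions.
import numpy as np

GRAVITY = 9.81     # m/s^2

def classify_window(acc_xyz: np.ndarray) -> str:
    """Classify a window of accelerometer samples with shape (N, 3), in m/s^2."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    if np.min(magnitude) < 0.3 * GRAVITY:       # near free fall suggests a throw
        return "throwing"
    if np.std(magnitude) > 5.0:                 # large repeated oscillations suggest shaking
        return "shaking"
    return "at rest / other"
```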

Conclusion

This study developed a digital sensor-based user interface (SUI) for analyzing behavioral patterns, utilizing flexible pressure and humidity sensors alongside a commercial inertial measurement unit (IMU). A piezoresistive pressure sensor with rapid response, high sensitivity at low pressures, and a straightforward fabrication process was engineered. For humidity sensing, a capacitive sensor with high linearity and minimal sensitivity to external temperature variations was fabricated. Each sensor was expanded into an array to enhance the measurement area and resolution, and device motion was detected using a commercial IMU. Input data from each sensor were preprocessed and then visualized using heatmaps and 3D motion representations. Conventional vision-based behavioral pattern analysis techniques, which rely on videos recorded from a third-party perspective, are limited in analyzing fine-grained behavioral patterns because of observer subjectivity. Integrating the three sensor types into an SUI platform, however, made it possible to digitize, collect, and analyze subtle behavioral patterns of young children that may go unnoticed by experts or caregivers. This research may lay the foundation for the development of SUIs for young children as well as for individuals with communication difficulties.