1 Cyber-physical Systems for Pedagogical Rehabilitation

The concept of Cyber-Physical Systems (CPS) has been introduced recently to denote technical devices with adaptive, sensing and reasoning abilities and a varying degree of autonomous behaviour within networked environments (i.e. the Internet of Things), with or without a human in the information and control loop [1,2,3,4,5]. The CPS approach to education and pedagogical rehabilitation is of special interest for its potential to become a ubiquitous educational tool (also in times of pandemics), as currently developed within the H2020-funded project “CybSPEED: Cyber-physical Systems for Pedagogical Rehabilitation in Special Education” (2017–2022) [6]. The authors of the paper understand pedagogical rehabilitation as a set of behavioural methods for teaching new learning and social skills that resemble games and classroom activities rather than therapeutic approaches. Within the framework of an educational system, it encompasses the widest range of possible corrections of neurodevelopmental disorders, such as speech therapy, or interventions targeting minimal brain dysfunction, delays in the acquisition of learning abilities, hyperactivity, attention deficit, etc.

The present paper describes, from a CPS perspective on pedagogical rehabilitation, the newly designed educational scenarios with the walking robot BigFoot, patented at IR-BAS [6,7,8,9,10]. The non-humanoid robot BigFoot is toy-sized, is controlled via a laptop or joystick (with eye-gaze control planned for the future), and does not physically interact with the child.

1.1 Background Studies of the Appropriateness of Using Toy-like Robots in the Pedagogical Rehabilitation of Children with Autism

Robots interacting with children have been considered in several EU-funded projects, both completed and ongoing. The project ALIZ-E designed scenarios for long-term interaction of children with diabetes with social robots in hospital settings [11]. The project MOnarCH focuses on modelling mixed human-robot societies [12]. The project DREAM uses NAO and Probo (an elephant robot) to communicate with children with autism and help teach them social skills [13]. These projects emphasise the clinical relevance of robotic technology, whereas we place the technology in the broader context of pedagogical and social communication in standard and special education, hence in inclusive education settings.

Intrinsic motivations [14] were modelled in the FP7 project IM-CLeVeR, where agents (animals, humans and robots) are guided by internal drives for entertainment and socialisation that are more sophisticated than the basic survival drives. Intrinsic motivation is the attraction of a cognitive system to novelty, a model which distinguishes between novelty and surprise. IM-CLeVeR embodies in the iCub robot the intrinsic motivation of higher-level brains to seek new knowledge [15], thereby sustaining learning and self-improvement over the course of life.

The CybSPEED Action adopts a similar, intrinsic-motivation-like approach to learning by designing human-robot situations (games, pedagogical cases, and artistic performances) and advanced interfaces in which children and students interact with the novel technology to enhance the underlying self-compensation and complementarity of brain encoding during learning.

2 Functionalities of the BigFoot Robot

One of the main elements of the proposed CPS is a 3D-printed walking robot, called BigFoot in games with children, built on a minimalistic principle via 3D printing technology [6, 7, 10]. 3D printing is entering many fields today, including robotics, because it is affordable and allows the rapid creation of functional models [16, 17]. Children with learning difficulties enjoyed playing with both humanoid and non-humanoid robots, such as NAO and BigFoot, within the METEMSS project [18, 19]. Here we present the further development of BigFoot as an educational tool for acquiring novel cognitive and social skills.

BigFoot has only two motors yet several functionalities: it can walk, turn through 360\(^{\circ }\) and even overcome obstacles. The robot consists of a round base 1 on which the body 2 is mounted (Fig. 1). The drive elements and electronic components are located in the body 2 of the robot.

Fig. 1. Two views of the BigFoot robot.

Fig. 2. Robot actuators. Left: walking mechanism; right: rotation mechanism.

The possible movements of the robot are of two types, moving forward or backward by stepping and changing orientation:

  • The stepping is performed by a DC motor 3, which, by means of a gear, drives the gear shaft 4, the arms 5L, 5R and the steps 6L, 6R. The steps move along a circular arc and, thanks to the gears 7L and 7R, maintain a constant orientation relative to the round base 1 (Fig. 2, left). The robot thus alternates between two main phases: in the first, the circular base 1 is fixed and the steps move; in the second, the steps are fixed and all the other elements move.

  • The change of orientation is carried out by a DC motor 8, which, by means of the gear 9, rotates the body of the robot relative to the fixed circular base 1 (Fig. 2, right). This movement is possible only in the first phase, in which the robot rests with its round base on the ground (see the sketch after this list).
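To illustrate this phase constraint, the following minimal Python sketch models a controller that refuses a turn command while the robot stands on its steps; all names and methods here are illustrative placeholders, not the actual BigFoot firmware interface:

```python
from enum import Enum

class Phase(Enum):
    ON_BASE = 1   # circular base 1 on the ground: turning is allowed
    ON_STEPS = 2  # steps 6L/6R on the ground: only stepping continues

class GaitController:
    """Toy model of BigFoot's two-phase movement constraint."""

    def __init__(self):
        self.phase = Phase.ON_BASE

    def step(self):
        # Stepping toggles between the two phases of the gait cycle.
        self.phase = (Phase.ON_STEPS if self.phase is Phase.ON_BASE
                      else Phase.ON_BASE)

    def turn(self, degrees: float) -> bool:
        # Rotation (motor 8, gear 9) works only while the base is down.
        if self.phase is not Phase.ON_BASE:
            return False  # command rejected mid-step
        # ... here the rotation command would be sent to the robot ...
        return True

ctrl = GaitController()
ctrl.step()                    # base lifts, the steps carry the robot
assert ctrl.turn(30) is False  # turning is blocked in this phase
ctrl.step()                    # base is down again
assert ctrl.turn(30) is True
```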

The main mechanical components of the robot are made with a 3D printer. Detailed information on the movements, kinematics and obstacle overcoming of the robot is given in [6,7,8,9], and its dynamics is studied in [10]. Here we present the sensor system of the robot, which is being developed further in order to adequately support the newly formulated educational scenarios.

Sensor System of BigFoot. A system of an optocoupler 10 and a disk 11 with a hole is used to count the steps. A second optocoupler 12 detects the rotation of the robot body 2 relative to the circular base 1. There are 12 holes in the base, which allow the rotation to be read in 30\(^{\circ }\) increments. Three analog infrared (IR) distance sensors 13 are mounted at the designated front of the robot. They allow distances to obstacles in front of, or to the sides of, the robot to be registered and measured. There is a color sensor 14 in the body 2 of the robot.
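As a rough illustration of how these optocoupler counts translate into odometry, the sketch below accumulates step ticks and 30\(^{\circ }\) rotation increments into a pose estimate; the step length and all identifiers are assumed placeholders, not measured BigFoot parameters:

```python
import math

STEP_LENGTH_CM = 5.0     # assumed step length; the real value is robot-specific
TURN_INCREMENT_DEG = 30  # 12 holes in the base -> 360/12 degrees per tick

class Odometry:
    """Dead-reckoning pose from the step and rotation optocoupler ticks."""

    def __init__(self):
        self.x = self.y = 0.0   # position in cm
        self.heading_deg = 0.0  # orientation, 0 = initial facing

    def on_step_tick(self, forward: bool = True):
        sign = 1.0 if forward else -1.0
        rad = math.radians(self.heading_deg)
        self.x += sign * STEP_LENGTH_CM * math.cos(rad)
        self.y += sign * STEP_LENGTH_CM * math.sin(rad)

    def on_turn_tick(self, clockwise: bool = True):
        self.heading_deg += -TURN_INCREMENT_DEG if clockwise else TURN_INCREMENT_DEG
        self.heading_deg %= 360.0

odo = Odometry()
for _ in range(3):
    odo.on_step_tick()             # three steps forward
odo.on_turn_tick(clockwise=False)  # one 30-degree rotation tick
print(round(odo.x, 1), round(odo.y, 1), odo.heading_deg)  # 15.0 0.0 30.0
```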

Fig. 3. Software management dialogs. Left: settings panel; right: main panel.

Interactive software has been developed to manage the robot movement, based on the LabVIEW language [20]. Figure 3 shows the two dialog boxes of the platform.

In the Settings panel, Fig. 3 (left), the user can change basic parameters of the robot's operation, as well as add and disable functionalities. The communication between the robot and the control computer is carried out remotely via Bluetooth, which creates a virtual serial port. In this panel, a specific “Serial port” can be selected, as well as the specific address (ID) of the robot to communicate with (“Unit address”). The platform allows communication with multiple robots (up to 9); access to a specific robot is through an identifier numbered from 1 to 9. The control is usually performed via the keyboard: forward/backward movements with the Up/Down arrow keys, and left/right rotation with the Left/Right keys. The control also allows the use of a joystick; if one is available, it can be selected and activated from the corresponding menu. As the platform is designed to work with different robots, which in some cases use different types (e.g. parameters) of electric motors, the Settings panel also provides an option to change the frequency of the pulse-width modulation (“PWM freq.”).
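To make the addressing scheme concrete, here is a minimal Python sketch using the pyserial library; the frame format (unit address prefix plus a one-letter command) is purely illustrative, since the actual protocol of the LabVIEW platform is not specified here:

```python
import serial  # pyserial: pip install pyserial

# Bluetooth presents itself as a virtual serial port, e.g. COM5 or /dev/rfcomm0.
PORT = "/dev/rfcomm0"  # assumed port name

def send_command(link: serial.Serial, unit: int, command: str) -> None:
    """Send a command to one of up to 9 robots, selected by its unit address."""
    if not 1 <= unit <= 9:
        raise ValueError("unit address must be 1..9")
    # Hypothetical frame: '<address><command>\n'; the real framing may differ.
    link.write(f"{unit}{command}\n".encode("ascii"))

with serial.Serial(PORT, baudrate=9600, timeout=1.0) as link:
    send_command(link, unit=2, command="F")  # e.g. ask robot #2 to step forward
```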

The selected infrared distance sensors register objects (obstacles) in the range of 10–30 cm; the output voltage of the sensor varies proportionally in the range of 0.2–2.5 V at a supply voltage of 3.3 V. In order to set the minimum permissible safety distance to an obstacle, a threshold voltage (respectively, distance) is set in the field “Obstacles level trigger”: when the distance falls below it, an alarm signal must be generated and a response should follow. In other words, this parameter binarizes the analog input of the IR sensor, generating an event once the voltage rises above the set value.
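A minimal sketch of this binarization, assuming the stated 0.2–2.5 V output range and a hypothetical trigger level (the event logic below is illustrative, not the platform's actual implementation):

```python
TRIGGER_VOLTS = 1.5  # assumed "Obstacles level trigger" setting

def obstacle_event(samples_v, trigger=TRIGGER_VOLTS):
    """Binarize the analog IR reading: yield True on each rising crossing.

    Closer obstacles produce higher voltage (0.2 V at ~30 cm up to
    2.5 V at ~10 cm for the assumed sensor), so an event fires when
    the voltage rises above the trigger level.
    """
    armed = True
    for v in samples_v:
        if armed and v > trigger:
            armed = False  # fire once, then wait for the signal to drop
            yield True
        elif v < trigger:
            armed = True
            yield False
        else:
            yield False

readings = [0.4, 0.9, 1.6, 2.1, 1.2, 0.5]  # volts: obstacle approaches, then recedes
print(list(obstacle_event(readings)))      # [False, False, True, False, False, False]
```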

The color sensor mounted in the robot base allows a wide range of colors to be registered, but for training purposes and easier interaction with the environment, the platform is programmed to recognize the three primary colors (red, green, blue) plus yellow. Because the sensor is sensitive to surrounding (parasitic) light sources, it can be adjusted by changing the gamma coefficient, the gain and the integration time. Each of these parameters has its own control and can be changed by the user. To activate the recognition algorithm, the user selects a specific color from the drop-down menu “Color search”. The currently registered color is presented in the large square field of the “Color sensor preview”, and the name of the recognized color (if any) is shown in the field at the base of the square.
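One simple way to implement such four-color recognition is nearest-neighbour matching in RGB space; in the sketch below the reference values and rejection threshold are assumptions for illustration, not calibrated sensor data:

```python
# Reference RGB values for the four training colors (assumed, not calibrated
# to the actual sensor; real readings depend on gain, gamma and integration time).
REFERENCE = {
    "Red":    (220, 40, 40),
    "Green":  (40, 200, 60),
    "Blue":   (40, 60, 220),
    "Yellow": (230, 210, 50),
}

def classify(rgb, max_distance=120.0):
    """Return the nearest of the four training colors, or None if too far."""
    best_name, best_d2 = None, float("inf")
    for name, ref in REFERENCE.items():
        d2 = sum((a - b) ** 2 for a, b in zip(rgb, ref))
        if d2 < best_d2:
            best_name, best_d2 = name, d2
    return best_name if best_d2 <= max_distance ** 2 else None

print(classify((210, 190, 70)))  # Yellow
print(classify((90, 90, 90)))    # None -> no recognized color
```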

The “Main” panel in Fig. 3 (right) is the main interface in which the interaction with the environment takes place. All the sensory information coming from the robot is presented, with animation adapted to the functionality, as are the movements in the selected direction. On the left and at the bottom of the window there are two yellow sliders with values 0–100%, setting the PWM duty cycle, which adjust the rotational speeds of the two motors for walking and for lifting the legs, respectively. At the bottom of the Main window the status of each of the three IR distance sensors is visualized as an electric lamp. When an obstacle appears below the limit set in the Settings menu, the lamps turn yellow and start flashing. The value of the measured distance is visible in a window inside the indicator icon, as well as on a linear indicator at the side.
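For concreteness, the slider value maps to a PWM duty cycle in the usual way; the helper below assumes an 8-bit PWM register, which is an assumption rather than the platform's documented resolution:

```python
def duty_register(slider_percent: float, resolution_bits: int = 8) -> int:
    """Map the 0-100 % slider value to a PWM compare-register value."""
    slider_percent = min(max(slider_percent, 0.0), 100.0)  # clamp to range
    return round(slider_percent / 100.0 * ((1 << resolution_bits) - 1))

print(duty_register(50))  # 128 -> roughly half speed on the selected motor
```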

The currently registered color is presented in the form of a large human eye that dynamically changes its color. When the specific color recognition function is activated and the robot steps on such a colored surface, the normally pure-white background of the panel is dotted with a colored pattern. When the robot is controlled and moves in a certain direction, its movements are animated, and the animation changes depending on the direction of movement. As mentioned, the steps and angles of rotation of the robot are registered by means of optocouplers, and their values are presented in the respective “Steps” and “Turns” fields.

For the implementation of different game scenarios, the panel contains a timer counting down the time to perform a certain task. An automatic timer stop is provided for the cases when color recognition is activated; in this way, the time from a certain starting position to reaching a target position marked with a certain color can be accurately measured. When the timer is started, the step count can be reset. At the end of the interval, the complexity of the movements is taken into account and the approach to the target point is analyzed. A function is under development in which each step is recorded into a control algorithm; this algorithm can subsequently be used to automatically execute the recorded trajectory, so that the user will be able to set programs for subsequent autonomous control of the robot.
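The following Python sketch mimics the described timer and trajectory-recording features; the class and command letters are hypothetical, and the actual LabVIEW implementation is not reproduced here:

```python
import time

class TrialRecorder:
    """Times a run to a colored target and records the commands for replay."""

    def __init__(self, target_color: str):
        self.target_color = target_color
        self.commands: list[str] = []  # e.g. "F", "B", "L", "R"
        self.steps = 0
        self.start = time.monotonic()
        self.elapsed = None

    def record(self, command: str):
        self.commands.append(command)
        if command in ("F", "B"):
            self.steps += 1

    def on_color(self, color: str):
        # Auto-stop: the timer halts when the target color is recognized.
        if color == self.target_color and self.elapsed is None:
            self.elapsed = time.monotonic() - self.start

    def replay(self):
        # The recorded command list doubles as a program for autonomous replay.
        return list(self.commands)

trial = TrialRecorder("Yellow")
for cmd in ["F", "F", "L", "F"]:
    trial.record(cmd)
trial.on_color("Yellow")
print(trial.steps, trial.elapsed is not None, trial.replay())
```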

To demonstrate autonomous control, the software implements an algorithm for automatic movement with obstacle bypassing, without user intervention. When the function is activated (via the “Manual/Auto” button), the robot moves ahead until obstacles appear in its path; it then turns in a suitable direction or steps back in order to bypass them.
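A reactive loop in the spirit of this “Auto” mode might look as follows; read_ir() and the movement callbacks are hypothetical stand-ins for the robot's real I/O, and the threshold sits inside the sensors' stated 10–30 cm working range:

```python
SAFE_CM = 15  # assumed clearance threshold within the 10-30 cm sensor range

def auto_step(read_ir, forward, turn_left, turn_right, back):
    """One iteration of the automatic obstacle-bypassing controller."""
    left, front, right = read_ir()  # distances in cm from the 3 IR sensors
    if front > SAFE_CM:
        forward()      # path is clear: keep walking
    elif left > right:
        turn_left()    # more room on the left
    elif right > left:
        turn_right()   # more room on the right
    else:
        back()         # boxed in: step back first

log = []
auto_step(lambda: (20, 9, 12),
          forward=lambda: log.append("F"),
          turn_left=lambda: log.append("L"),
          turn_right=lambda: log.append("R"),
          back=lambda: log.append("B"))
print(log)  # ['L'] -- front blocked, more room on the left
```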

3 Educational Scenarios with BigFoot

The concept of the scenario is based on the interaction between two children and the walking robot. A simple scene is created in which the starting position of the robot is marked. There are several possible target positions to which the robot, controlled by the user, should move. The positions are hexagonal holes Pi (i = 1 \(\div \) 6) in a rectangular area (Fig. 4).

Fig. 4. Scenarios for robot movement.

Two children take part in the game: a robot operator and a goal setter. The goal setter (3) uses colored hexagrams (4), which s/he puts in position Pi. The hypothesis is that the following elements are learned: directions, distance, color, teamwork and socialization. The robot can measure the time to complete the task, the finding of the right color, the number of steps (distance), and the direction. A task can be formulated: go with the robot to a hexagram of a certain color. The hexagrams are of the four main colors. The user (5a) controls the walking robot by means of a laptop (6) or a tablet in order to reach the goal (Fig. 5).

Educational Game Scenario with BigFoot. A game scenario has been developed with the participation of two children and a teacher. One child's goal is to place a hexagram tile in one of the following colors: yellow, red, blue or green, on a board with holes. The other child's goal is to control the robot via a computer to make it step on the tile and recognize the color with the color sensor (Fig. 5). When the goal is reached, the child who placed the tile says “Bravo, you succeeded!”, and in case of failure s/he says “Try again!”. This is an incentive that encourages the child to play, to enjoy the game and not to give up in case of failure.

Fig. 5. A concept for interaction between two children and the robot.

Intrinsic motivation is an important factor in children's acquisition of new knowledge and skills. During the game, the teacher observes the children's actions and, if necessary, mentors or helps them. At the teacher's discretion, the two children change roles: the child who controlled the robot will place a colored hexagram tile on the board, and the other will control the robot. The expected result of the game is that the children will improve their communication with each other and, if they have difficulty recognizing directions or colors, the joint game will help them acquire this new knowledge.

The main advantages of the robot game are the following: it is cheap and simple to manage, the robot climbs obstacles, and it collects information about the quality of task performance. The game puts children in different situations: setting tasks, or controlling the robot to complete a task. Moreover, the scenario stimulates the three main therapy tasks in cases of autism: imitation, joint attention and turn taking [21,22,23,24].

4 Evaluation of the Fitness of the BigFoot Scenarios for Pedagogical Rehabilitation

From a systems perspective, the so-called evolving design of educational situations (games) for children with special learning needs, which describes the transition from one experiment in real-life conditions to another rather than just from pilot to real-life testing, was tested with one of the early designs of the BigFoot robot [18]. The architecture of the robot has allowed modifications requested by teachers in line with the needs of the child. The BigFoot scenario was one of the successes with children in both standard and special education. Children playing the game feel empowered to control complex technologies and solve sophisticated tasks. Moreover, a child with autism who was avoiding human presence let the therapist convince him to play with the robot, and the child enjoyed the sessions. In the present work we investigate the potential of BigFoot to enhance the social communication of children with autism playing in turns.

Table 1. Teacher’s ratings of the game

At the moment, the empirical evaluation of the novel scenario is being performed. The Ethics Committee for Scientific Research (ECSR) of IR-BAS gave permission to conduct the study with Protocol 4/10.02.2022. We present here the data from two cases: two children with high-functioning autism (ASC) playing the game, and two neurotypical (NT) children playing the same game.

Table 1 presents the Likert scales, which parents/teachers filled in during and after the game. Figure 6 presents a basic comparison of the obtained mean scores.

Fig. 6. Comparison of the obtained mean scores.

The results can be interpreted as follows. All parents/teachers consider the game highly appropriate for the overall development of their children, and we believe it can be commercialized and standardized for use in education. A difference is observed in the item on the child's motor development: parents of NT children do not rate the game high for motor development, since their children are probably more involved in other activities, unlike the children with ASC. The main difference is in the score for self-initiated social contact (SISC): parents/teachers report a lower rate of SISC in ASC, yet they believe the game addresses this issue in a very appropriate way. The game will be further tested with other groups of children as well.

5 Conclusions

The paper presented the currently designed novel educational scenarios with the walking robot BigFoot from a cyber-physical systems perspective on pedagogical rehabilitation. The sensor system of the robot, developed towards following the child's strategy in the game and performing high-level analysis of the observed child-robot interaction, was presented. The results of the pilot study demonstrate significant potential to improve children's imitation, eye-contact and turn-taking skills in a positive, encouraging educational environment.