1 Introduction

The beginning of the twenty-first century introduced numerous new technologies that assist people in everyday duties and leisure. New communication technology is integrated into our daily routines [9]. Many people feel uncomfortable without smart devices and installed applications because they offer a wide range of informative content and entertainment, and even allow the user to monitor and collect body condition parameters, to mention just one possible medical context of mobile systems. The term “smart devices” may refer to several different types of electronic equipment, from devices that are principally manipulated by individuals, such as smart phones or smart watches, to the constitutive elements of so-called ubiquitous computing, that is, an environment with pervasive sensors and information-processing capability [14]. In the past, the general functions of a cell phone were limited to calling and sending text messages. Today, the smart device has become the most popular technology: a multifunctional device that not only allows communication but also helps its user learn and have fun. This is made possible by the development of smart device applications [10, 16].

A strong advantage of smart devices equipped with applications, when compared to human skills, is the possibility of non-stop work and, in particular, the ability to constantly monitor and focus on numerous signals, their sequences, etc. Events identified by an application installed on a smart device and used to activate an action of the device may occur synchronously or asynchronously. Conditions for triggering an action may include complex logic rules, and they are usually formulated in a rule system. Circumstances for running a defined procedure may be controlled in the dimensions of space and time, and some special actions may put the application into sleep mode. A set of activation rule components may be extended by additional objects that are detectable by a smart device. An example of such technology is Estimote Beacons. These are small wireless sensors, often in the form of stickers, that can be attached to any location or object. They broadcast tiny radio signals that are detectable by smart devices and can be interpreted to unlock micro-location and contextual awareness [12].
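As an illustration of such activation rules, the minimal sketch below (in Java, with hypothetical names; it is not part of the described system) combines a spatial condition (maximum distance to a beacon), a temporal condition (active hours) and a minimum pause between consecutive activations into a single trigger decision.

```java
// Hypothetical sketch of a proximity/time trigger rule. It assumes the app can
// estimate the distance to a detected beacon and remembers when it last reacted.
import java.time.Duration;
import java.time.LocalTime;

final class TriggerRule {
    private final double maxDistanceMetres;      // spatial condition
    private final LocalTime activeFrom;          // temporal condition: start of active hours
    private final LocalTime activeUntil;         // temporal condition: end of active hours
    private final Duration minPauseBetweenRuns;  // "sleep" between consecutive activations

    TriggerRule(double maxDistanceMetres, LocalTime activeFrom,
                LocalTime activeUntil, Duration minPauseBetweenRuns) {
        this.maxDistanceMetres = maxDistanceMetres;
        this.activeFrom = activeFrom;
        this.activeUntil = activeUntil;
        this.minPauseBetweenRuns = minPauseBetweenRuns;
    }

    /** Returns true when an action (e.g. playing a recorded word) should be triggered. */
    boolean shouldTrigger(double estimatedDistanceMetres, LocalTime now, LocalTime lastTriggered) {
        boolean closeEnough = estimatedDistanceMetres <= maxDistanceMetres;
        boolean withinActiveHours = !now.isBefore(activeFrom) && !now.isAfter(activeUntil);
        boolean pauseElapsed = lastTriggered == null
                || Duration.between(lastTriggered, now).compareTo(minPauseBetweenRuns) >= 0;
        return closeEnough && withinActiveHours && pauseElapsed;
    }
}
```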

The goal of the reported research and experimental work was to build a system that could assist an autistic child in learning the pronunciation and meanings of new words. This process is typically conducted by parents, who must systematically repeat a particular word to the child over several days in order to teach the child how to use the word. The difficulty of the teaching process lies in the fact that it must be conducted systematically, with love and patience, over a period of several days. Because of limited oral communication with an autistic child, the saturation of the process is hard to measure. Therefore it was decided to build a system that would stimulate and engage the entire family of an autistic child. Parents were expected to select new objects (new words) and mark them with beacons, while the child listened to the parent’s voice pronouncing the new word when the object was detected at a very close distance to the child. In order to maximize the effectiveness of teaching, the parents’ voices were recorded while pronouncing the new words. This prevented the autistic children from hearing new, unknown voices and made them feel safer, in line with the long-standing notion that a familiar voice or sound puts an autistic child at ease. Smart phones or smart watches were selected as the platform for the children’s interaction and for playing the parent’s voice, while space and object awareness were built by attaching individual beacons to certain objects in the house. To make a clear, direct association between the audible voice and the named object, the mobile device displayed a photo or a cartoon image of the object, and the autistic child had to touch the screen in response to the played voice and displayed image. Children who participated in the experiments treated this interaction as a game. Alternatively, beacons could be installed to adopt an active role. In such a configuration the beacon (or a device connected with it) would raise an action whenever a user with a corresponding application appeared in proximity range. Although this approach with an active role of beacons is under consideration, the first experiments were conducted with beacons installed on household objects such as a refrigerator, a computer or a table. The active part of the interaction was played on a smart phone or a smart watch handled by the child.

The scope of the paper is the following. After this short introduction, Section 2 explains the complex disabilities associated with autism. Section 3 contains a review of related research work. Then, in Section 4, the Let’s Play application is described, from software requirements to screenshots. Sections 5 and 6 describe our experiments conducted with two autistic children who used the application in a controlled environment. A short summary concludes the presentation.

2 Concept of autism

Leo Kanner [23] was the first to describe autism symptoms, after noticing a shared lack of interest in other people in a group of children who had previously been diagnosed with various problems, including simple mental retardation. Since that first recognition of ‘Early Infantile Autism’, the scientific and medical world has made great progress in understanding autistic symptoms [18]. The term autism refers to Autism Spectrum Disorders (ASD). Autism is one of the most common developmental disabilities. It affects people of every race, ethnic group, and socioeconomic background. ASD is now defined by the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders (DSM-5) as a single disorder that includes disorders previously considered separate – autism, Asperger’s syndrome, childhood disintegrative disorder, and other pervasive developmental disorders not otherwise specified. The term “spectrum” refers to the wide range of symptoms and severity. These are brain-based disorders that affect behavior as well as social and communication skills. Although the term “Asperger’s syndrome” is no longer in the DSM, some people still use it; it is generally thought to refer to the mild end of the autism spectrum disorder [3, 21, 27].

The Centers for Disease Control and Prevention describes ASDs as developmental disabilities that cause substantial impairments in social interaction and communication and the presence of unusual behaviors and interests. Many people with ASDs also have unusual ways of learning, paying attention, and reacting to different sensations. The thinking and learning abilities of people with ASDs can vary from gifted to severely challenged. An ASD begins before the age of 3 and lasts throughout a person’s life [2, 7].

Children or adults with ASD might [7]:

  • not point at objects to show interest (for example, not point at an airplane flying over)

  • not look at objects when another person points at them

  • have trouble relating to others or not have an interest in other people at all

  • avoid eye contact and want to be alone

  • have trouble understanding other people’s feelings or talking about their own feelings

  • prefer not to be held or cuddled, or might cuddle only when they want to

  • appear to be unaware when people talk to them, but respond to other sounds

  • be very interested in people, but not know how to talk, play, or relate to them

  • repeat or echo words or phrases said to them, or repeat words or phrases in place of normal language

  • have trouble expressing their needs using typical words or motions

  • not play “pretend” games (for example, not pretend to “feed” a doll)

  • repeat actions over and over again

  • have trouble adapting when a routine changes

  • have unusual reactions to the way things smell, taste, look, feel, or sound

  • lose skills they once had (for example, stop saying words they were using).

To date, no cure has been proposed for people affected by ASD. However, the research and practice of institutes, clinics and families prove that the development of an autistic child may be significantly improved by early intervention and treatment services [17]. Basic communication skills and social competences may be achieved by increasing the use of words. The majority of autistic children have no problem with pronunciation, but they encounter barriers in using language effectively and may have problems catching the nuances of body language and vocal tones [33]. For communication purposes, for instance to express needs when being with others or to give signals while playing with siblings, it is beneficial if a child can say even single words.

2.1 Communication difficulties

An autistic child often has difficulty understanding other people as well as communicating effectively with them. As a matter of fact, children on the autism spectrum sometimes cannot see any motivation to communicate with others at all.

Consequently, an autistic child may be delayed in the acquisition of language skills, which leads to disappointment when they cannot communicate with others to tell them about their needs. They may therefore avoid joining a group in social situations and thus have fewer opportunities to acquire language skills. An autistic child generally has a complex communication problem that goes beyond speech and language difficulties. Children on the autism spectrum usually find it hard to interpret social behavior. Their unwillingness to make contact with people may be evident in the way they fail to make eye contact.

They also use hand gestures or body language to communicate with others. Therefore, the delay in the acquisition of language skills is considered the most obvious indication for families that something is wrong. However, the diagnostic criteria for an autistic child should take into account all types of communication and the child’s ability to interact easily and successfully with other children, not just language and speech. In many cases, specialists found that autistic children have limited or no speech, and their understanding of another person’s speech varies enormously [25].

As a consequence of these conditions, the child needs more effort and time from therapists to develop and improve listening, attention, and social skills. In some cases children have good language skills and vocabulary, so they can talk about a particular subject in great detail. There are also autistic children who have difficulty pronouncing words and using language effectively, and many also have problems with word and sentence meaning [28].

2.2 Impairments in autistic children

An autistic child may have problems with any or all of the capabilities involved in generating or understanding speech and language. For instance, because they are sometimes unable to understand social situations, they may believe it is not necessary to communicate, or may have no understanding of how other people could respond to a communicated message.

An autistic child sometimes does not respond to auditory information or pay attention to it. Therefore, therapists spend a lot of effort and time training them to pay attention to sounds. Even when they are paying attention, autistic children sometimes have problems decoding the meaning of words used in communication between other people and matching them with thoughts or objects.

For some autistic children this becomes even more difficult because of a mapping problem; for example, they have difficulties with the articulation of oral-motor functions (movements of the lips and tongue and associated breath control). On the plus side, however, children on the autism spectrum very often pay attention to and appreciate visual materials.

Consequently, visual methods are frequently the best way to get access to autistic children’s minds and, in turn, to support them in expressing themselves [15]. For any given autistic child, which particular difficulty they face, and which difficulty troubles them most at any stage of development, can only be determined by careful assessment.

There are standardized tests that help to determine the situation of an autistic child, but they need careful administration and interpretation, in part because these tests were not developed with consideration of the kinds of deficits that autistic children may have. Therefore, both the administration and the interpretation of such tests may be problematic because of unusual patterns of performance.

3 Related work

The number of children diagnosed with autism spectrum disorders continues to grow [19]. According to an estimate provided by the Centers for Disease Control and Prevention, 1 out of 110 children in the USA has an ASD – 1 in 70 boys and 1 in 310 girls [7]. There are several support methods used to back children diagnosed with autism. Their effectiveness is empirically observed and compared. Among the recommended approaches are clear instructions, repetition, practice and reinforcement [4]. This methodology is suggested to be especially effective for people who were diagnosed with ASD during their preschool years [11]. Eaves and Ho [11] estimate that roughly 50 % of patients diagnosed with autism in preschool have a poor chance of independent living in their adult years. Other researchers report that the percentage of early-diagnosed ASD individuals who had a ‘very good’ outcome in their adult life may vary from 4 to 12 % [5, 20]. Individuals with a childhood performance IQ above 70 had a significantly better outcome than those with an IQ below that level [20]. The cited research confirms that there is a variety of interventions and support methods that have a positive impact on children diagnosed with ASD and increase their chances for education and a satisfactory, independent life in their adult age.

With the advent of the twenty-first century, more and more IT tools have been developed to assist autistic children with their education. In order to attract children to a didactic tool, some of the applications take the form of a game. For instance, Piper et al. [30] designed a four-player tabletop cooperative game to support the development of social skills. Children were observed to be attracted by the possibility of cooperation. A wide study and evaluation of assistive technology is presented in [31]. Reichle claims that it seems reasonable to hypothesize that the ‘high tech’ advantage may be specific to individual learners. For example, one could hypothesize that some learners may be distracted by some features of high-tech equipment, while other learners may find using a high-tech application more motivating than a low-tech one. Reichle also points to the fact that the use of emerging technologies tempts researchers to focus on demonstrating that the technology has a significant effect on learner performance. Skills and gained knowledge are easier to verify if they are used in a different context, e.g. both in a computer game and in real-life interaction.

Farr et al. [13] report experiments in which they used tangible technologies: Topobo and LEGO toys offered to two groups of children, typically developing (TD) children and a group with Autistic Spectrum Conditions (ASC). They observed different styles and sequences of play during the TD and ASC children’s sessions. For both participating groups, there were more social forms of play with Topobo than with LEGO: more solitary play occurred with LEGO and more parallel play with Topobo.

Mobile computers (including smart phones and smart watches) are especially predisposed to assist humans in their daily routines due to their light weight, easy portability and multipurpose character. Madsen et al. [24] used software installed on mobile computers to assist people with ASD in classifying human emotions based on images of faces. Hayes et al. [18] reported several interactive tools, including cameras, to assist an autistic child and a parent/teacher in creating, recording, collecting and producing materials for systematic work on vocabulary and scheduling in the particular context and circumstances where the child is.

A wide set of reports can be found on using desktop applications where social and communicative skills are secondary but the issue of understanding and using vocabulary is addressed [6, 8, 26, 34]. The ability to name objects, to understand what others say and to communicate with other people is crucial not only for an autistic child but also for the child’s family. A collection of computer games for parents and teachers of children with moderate to severe autism is available at [1]. Zakari et al., in their survey [35], compared the functionality and quality of 40 serious computer games designed for autistic children between 2004 and 2014. They classified and assessed the games from four points of view: technology platform, computer graphics, gaming aspects and user interaction.

Games that assist children in their education are an important and substantial segment of the entertainment and education industry. The didactic nature of a game is an important attribute which parents take into account when they choose toys for their children. Parents’ preference for buying toys that provide a strong stimulus and support various developmental domains of the child is reported by Kabadayi in [22]. A particular example of an application similar to the proposal discussed in this paper is LearnMyWorld, developed by Tehnoinfo [32]. LearnMyWorld is an application designed for devices running iOS 7.0 (iPhone and iPad). The software is addressed to children aged below 5 years. The program allows an adult to create a collection of photos (for instance of objects at home) and assign each picture a recorded sound pronouncing the object’s name or description. A child may interact with the application by tapping images, which magnifies the picture and plays the sound/pronunciation associated with the image.

One should also take into account that many games and other forms of interactive tools require some level of communicative skill as well as intelligence and agility. These are abilities that people acquire as they grow. Also, young children can only keep their attention on one process for a limited number of minutes. The time and operational complexity of a game must fit the child’s age and skills. In many games addressed to children the didactic goal is aimed at developing agility and intelligence. Our goal is to provide a simple tool that could assist parents in frequent repetitions of a limited number of words in order to enrich an autistic child’s vocabulary.

4 Let’s Play application

4.1 System architecture and requirements

In this section we focus on the system architecture design, requirements, security issues, and implementation of the system. In order to explain the system functionality and its structure in detail, the following terms are used:

  • client: refers to a smart device with the application installed on board;

  • user: refers to the parents or the autistic child, depending on the context and the functions used;

  • storage: refers to the local file storage (File I/O).

The services are offered by the data storage and accessed via the application. Because the users deal with the data storage and change its settings from time to time, security measures must be considered, along with security requirements that must be satisfied by the system to protect it from being abused. The user of the smart device application is an autistic child. The parents set the necessary information and make a request to get a service from the data storage (Fig. 1).

Fig. 1 Let’s Play: system architecture

The Let’s Play application is designed to be used for fun and educational purposes by autistic children. The application goes beyond simple graphics and gets the child engaged with the program. Children interact with the application on the smart device by touching a cartoon image or a photo displayed on the screen whenever a registered beacon is detected within neighborhood range. Additionally, the name of the object to which the beacon is attached is played. To turn off the displayed image the child needs to touch the display of the client device. The basic character of the skills required for operating the application is important because the application is used by very young children, starting from two years of age. The parents’ recorded voice pronouncing object names – things inside the house, means of transportation, fruits, animals – is systematically repeated to the child, and after a period of time the word and its mental association with the real object should be remembered. The application works with smart devices running Android 4.4 KitKat or a later version, and the device must support Bluetooth Low Energy 4.0.
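A minimal sketch of how such beacon detection can be realized on Android 4.4 is given below. It uses the platform’s BluetoothAdapter.startLeScan API available at that API level; parseIBeaconId() and onBeaconInRange() are hypothetical placeholders (a production implementation could equally rely on the Estimote SDK), so this is an illustration rather than the application’s actual code.

```java
// Sketch: scan for BLE advertisements on Android 4.4 (API 19) and react when a
// beacon is heard. parseIBeaconId() is a hypothetical helper that would extract
// the beacon identifier from the raw advertisement bytes.
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothManager;
import android.content.Context;

public class BeaconScanner {
    private final BluetoothAdapter adapter;

    public BeaconScanner(Context context) {
        BluetoothManager manager =
                (BluetoothManager) context.getSystemService(Context.BLUETOOTH_SERVICE);
        adapter = manager.getAdapter();
    }

    public void startScanning() {
        adapter.startLeScan(new BluetoothAdapter.LeScanCallback() {
            @Override
            public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
                // rssi grows towards 0 as the child gets closer to the tagged object.
                String beaconId = parseIBeaconId(scanRecord); // hypothetical parser
                if (beaconId != null) {
                    onBeaconInRange(beaconId, rssi);
                }
            }
        });
    }

    /** Hypothetical hook: look up the image/sound registered for this beacon. */
    protected void onBeaconInRange(String beaconId, int rssi) { }

    private String parseIBeaconId(byte[] scanRecord) {
        // Placeholder: a real implementation would locate the iBeacon frame in the
        // advertisement and read the 16-byte proximity UUID plus major/minor values.
        return null;
    }
}
```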

In comparison to other similar applications, such as LearnMyWorld [32] discussed in Section 3, the novelty of Let’s Play lies in several features:

  • Parents select a limited number of objects to focus learning by repetition of single phrases during a period of time (e.g. within a week). The number of identified objects is limited by the number of available beacon identifiers attached to real objects.

  • When parents observe satisfactory acquisition of a taught word (the child can pronounce the word and understands its meaning), they may decide to apply the used beacon tag to another object and make a new association with an image and a recorded phrase in the application.

  • Parents can define periods (hours) of application activity and the time between consecutive repetitions for an encountered object. The use of the application should not be annoying to children, and thus the operating parameters that define application behaviour may be individually adjusted.

  • The application is triggered at a defined proximity to identified objects to ensure a possible association of the encountered real object with the image displayed on the mobile device and the pronounced phrase. Location-triggered, or more precisely proximity-triggered, applications do not require a child to remember about the program but rather “wake up” independently when the child passes by a tagged object.

  • The application may be installed by parents on smart phones or smart watches. The possibility of using Let’s Play on different devices gives the parents the possibility to introduce changes into the didactic process. Using the application on different devices and changing the devices may attract an autistic child for longer periods of time and lead to learning more new words.

  • Statistics of Let’s Play use are recorded in the device memory for progress tracking and further analysis. Collecting logs from devices operated by different children allows comparison. One should note that the application may only collect objective data on the number of repeated words and activity periods. The child’s progress in new word acquisition must be observed, assessed and noted by the child’s parents. The data collected by individuals may be used in diagnosis as well as in research work.

  • Finally, Let’s Play assists parents in repeatable routines. The application supports the parents in two aspects: it systematically repeats the taught word and displays the associated image, and it collects information, including the number of performed actions, which can be used for diagnosis and for planning further training.

Let’s Play is designed for use by young children to learn the meaning and pronunciation of new words. Simplicity of design allows for effective interaction with the application even by children aged 2 years.

4.2 Functional requirements

Main system requirements that define its functionality include:

  • Logging into the system: the user must be able to use the smart device to start the system and request a service. Running the application is controlled by a parent via a password required to start the program.

  • Setting up the application: the user defines a set of application behavioural properties, such as the minimum period between consecutive repetitions of displaying an image and playing the object name pronunciation, etc.

  • Detecting beacons: the client detects beacons’ UUIDs in its neighbourhood by picking up the beacons’ signals and sending them to the data storage. The user makes associations between a beacon’s UUID, an image to display and a recorded sound.

  • The data storage checks the detected beacon’s UUID and confirms synchronization with the request.

  • Walking in a selected area: an autistic child passes near a beacon with his/her client (a smart watch or a smart phone) within the specified region where beacons are detected inside the proximity range, to get the information related to the object (image and sound). When a beacon is located, its image is displayed on the screen and the object’s name is pronounced – played from the recorded parent’s voice (a sketch of this reaction is given below).
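The sketch below illustrates the reaction described in the last requirement: a lookup from the detected beacon identifier to the attributes configured by the parent, followed by displaying the image and playing the recorded voice. Class and method names (ObjectCatalog, ObjectEntry, onBeaconEncountered) are assumptions for illustration, not the application’s actual API.

```java
// Sketch: when a registered beacon is detected in proximity range, show the
// associated image and play the parent's recorded pronunciation.
import android.media.MediaPlayer;
import java.util.HashMap;
import java.util.Map;

public class ObjectCatalog {
    /** Attributes a parent associates with one beacon. */
    public static class ObjectEntry {
        final String name;       // e.g. "chair"
        final String imagePath;  // photo or cartoon image of the object
        final String soundPath;  // parent's recorded pronunciation
        ObjectEntry(String name, String imagePath, String soundPath) {
            this.name = name; this.imagePath = imagePath; this.soundPath = soundPath;
        }
    }

    private final Map<String, ObjectEntry> byBeaconId = new HashMap<>();

    public void register(String beaconId, ObjectEntry entry) {
        byBeaconId.put(beaconId, entry);
    }

    /** Called when a registered beacon enters proximity range. */
    public void onBeaconEncountered(String beaconId) {
        ObjectEntry entry = byBeaconId.get(beaconId);
        if (entry == null) {
            return; // unknown beacon: the parent has not configured it yet
        }
        showImage(entry.imagePath);          // child dismisses it by touching the screen
        playRecordedVoice(entry.soundPath);  // parent's voice pronouncing the name
    }

    private void showImage(String imagePath) { /* display in the foreground activity */ }

    private void playRecordedVoice(String soundPath) {
        try {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource(soundPath);
            player.prepare();
            player.start();
        } catch (Exception e) {
            // playback errors are ignored in this sketch
        }
    }
}
```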

4.3 Security requirements

In order to protect parents and children from unauthorized use (e.g. someone recording an inappropriate pronunciation), the system should offer some basic protection against simple unauthorized use. The following are the security requirements for setting up the system:

  • The data storage must be able to authenticate the user.

  • The data storage must only accept requests, which were created by an authenticated client.

  • The data storage should prevent unauthorized changes to the data settings.

  • The client must be able to authenticate the data storage.

  • The use of the user’s password should be minimized.

4.4 Installing the application and first time launch

An Android application is stored in an APK file (e.g. ApplicationName.apk); accordingly, the user must install the APK file on the client. The Android application package (APK) is the package file format used by the Android operating system for the distribution and installation of application software and middleware. The client must run Android 4.4 KitKat or a later version to run the application. There are two different ways to install APK files on the client.

  • The first way is to download the APK file to the client, either onto an SD card or into the internal storage of the client, and then install the APK file using the Application Installer. This method is easier for users who have no experience with Android development. It is also useful for installing non-market applications.

  • The second way is to install the application directly from a computer onto the client. The user must connect the client to the computer via a USB cable and run the application from the development tools so that it is deployed to the connected client rather than to the emulator. The user must then approve the connection. The application will appear on the main screen of the client.

When launching the application for the first time, the user needs only one click on the application icon to start the application and get the service. The application will be ready to detect beacons in near range. The identified beacons appear on the screen in a sequence ordered by the distance between the client and the beacons; the beacons with the strongest signal come first in the queue, because signal strength depends on the distance and the broadcasting power value. Moreover, every beacon is distinguished by a different colour, MAC address, and major and minor numbers. Once the data appear on the screen, the user can distinguish between beacons attached to objects in the neighbourhood (see Fig. 2).
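The ordering described above can be obtained simply by sorting the detected beacons by received signal strength (RSSI), as in the minimal sketch below; DetectedBeacon and its fields are illustrative assumptions, not types from the application.

```java
// Illustrative sketch: list detected beacons strongest signal first, since for a
// given broadcasting power a stronger RSSI generally means a shorter distance.
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

final class DetectedBeacon {
    final String macAddress;
    final int major;
    final int minor;
    final int rssi; // received signal strength in dBm; values closer to 0 are stronger

    DetectedBeacon(String macAddress, int major, int minor, int rssi) {
        this.macAddress = macAddress;
        this.major = major;
        this.minor = minor;
        this.rssi = rssi;
    }

    /** Sorts the list in place so that the nearest (strongest) beacon comes first. */
    static void sortByProximity(List<DetectedBeacon> beacons) {
        Collections.sort(beacons, new Comparator<DetectedBeacon>() {
            @Override
            public int compare(DetectedBeacon a, DetectedBeacon b) {
                return Integer.compare(b.rssi, a.rssi); // descending RSSI
            }
        });
    }
}
```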

Fig. 2 Information about a beacon identified in a proximity range

A beacon is only an identifier attached to an object that is supposed to be encountered by a child. To ensure that the application recognizes the objects, the user (parent) must define the attributes associated with each beacon. The goal is to have the application identify just one or a few new objects in parallel, in order to focus the child’s attention only on the few selected and repeated words. In particular, during the first launch a parent must define the attributes associated with a beacon: a name, a sound and an image for each encountered beacon, e.g. for a chair: name = ‘chair’, sound = the recorded word ‘chair’ pronounced by a parent, and an image or photo of a chair. From time to time, depending on the pace of the learning process, the beacons can be attached to new objects, which requires redefining the object associated with the beacon.

The interface of the application is in practice identical whether it runs on a smart watch or on a smart phone. As the first step, parents should select an object, attach a beacon to it, and take a photo of the object or download an image of the object to the device. In step two, parents should record their pronunciation of the object’s name (see Fig. 3). It is important that parents record their own voices, which are already familiar to the child; in this way, one element of the child’s anxiety can be eliminated. In the third step the operator (the parent) needs to place the smart device within proximity range of the beacon in order to make an association between the beacon, its image and the recorded voice.

Fig. 3 Setting attributes of an object identified by a beacon

To change the settings (the attributes associated with a beacon), the user needs to enter the PIN code in the field that appears on the screen, as shown in Fig. 3. The reason for the security PIN code is to ensure that only parents (or another allowed person) are able to change the settings. We observed that autistic children are more interested in using the application when they handle the device and hear their mother’s or father’s recorded voice; unknown voices introduced unnecessary astonishment.
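As a sketch of how such a PIN gate might be implemented (an assumption on our part; the paper does not specify the mechanism), the parent’s PIN can be kept only as a salted hash, so it cannot be read back from the device:

```java
// Illustrative sketch of the PIN check guarding the settings screen. The PIN
// itself is never stored, only a salted SHA-256 hash. Names are assumptions.
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

final class PinGuard {
    private final byte[] salt;
    private final byte[] storedHash;

    PinGuard(byte[] salt, byte[] storedHash) {
        this.salt = salt;
        this.storedHash = storedHash;
    }

    /** Returns true only when the entered PIN matches the stored hash. */
    boolean allowSettingsChange(String enteredPin) {
        return MessageDigest.isEqual(storedHash, hash(enteredPin));
    }

    private byte[] hash(String pin) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            digest.update(salt);
            return digest.digest(pin.getBytes(StandardCharsets.UTF_8));
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e); // SHA-256 is always available
        }
    }
}
```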

Setting the attributes of an object is performed in the following steps:

  (a) The user gives the object a specific name.

  (b) The user records the sound pronouncing the selected object’s name; to make the autistic child feel more comfortable, the mother’s voice is recommended.

  (c) The user chooses an image of the object; in this step there are two options:

    (i) take a picture using the camera of the client, or

    (ii) choose a picture from the gallery, already saved on the device.

Finally, after specifying some more details of the interface and defining a set of operating parameters for the application (e.g. how long it operates in one session), the application is ready to use, and the user can take advantage of all its available resources.
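A minimal sketch of how the attributes from steps (a)–(c) might be kept in the local file storage is shown below, using one JSON record per beacon; the file layout and field names are illustrative assumptions, not the application’s documented format.

```java
// Sketch: persist the parent-defined attributes of one beacon as a JSON record
// in local file storage (org.json is bundled with Android).
import org.json.JSONException;
import org.json.JSONObject;
import java.io.FileWriter;
import java.io.IOException;

final class ObjectSettingsStore {

    static JSONObject toRecord(String beaconId, String name,
                               String imagePath, String soundPath) throws JSONException {
        JSONObject record = new JSONObject();
        record.put("beaconId", beaconId);   // identifier broadcast by the beacon
        record.put("name", name);           // e.g. "chair"
        record.put("imagePath", imagePath); // camera photo or gallery picture
        record.put("soundPath", soundPath); // parent's recorded pronunciation
        return record;
    }

    static void save(JSONObject record, String filePath) throws IOException {
        try (FileWriter writer = new FileWriter(filePath)) {
            writer.write(record.toString());
        }
    }
}
```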

5 Conducted experiments

The application developed on an experimental basis was validated in the homes of two families with autistic children. From the beginning the application attracted the attention of these two children and their families, and they enjoyed using it.

In order to give the children motivation for using the application, it was supported with a colorful, cartoon-like graphical user interface (see Fig. 2). Two smart devices were used in this experiment, a smart phone and a smart watch, because autistic children love to play with such devices [29]. Since the application was developed with autistic children as its immediate testers, we anticipated that the simple design, technological attractiveness and game form applied in Let’s Play would make it possible to break the communication barrier between autistic children and their families by helping them learn new words in a shorter time compared to learning without support from a smart device and beacons.

5.1 Participants: a boy and a girl – their history and conditions

Hussein is a four-and-a-half-year-old boy born in 2011. During the first meeting the researcher observed that Hussein had difficulty communicating with his family. Over the observation period, Hussein’s behavior changed visibly once he started to use the application.

Hussein, an only child, was diagnosed with autism when he was three years old. He could speak a few words with some difficulty. When he needed anything, he pointed at it with his hands. When he tried to tell his mother about something and she could not understand him, he became angry and began using incomprehensible language.

The second child participating in the experiment was a 3.5-year-old girl named Rawan. She was born in 2012 and has autism spectrum disorder (ASD). The researcher observed that the girl started to use the Let’s Play application with the word “mama” in the Arabic language; this word, recorded by Rawan’s mother, was part of the experiment.

The girl was diagnosed with autism when she was 2.5 years old. It is believed that there is a connection between her autism disorder and blood poisoning from heavy metals. Rawan lives with her parents and an older sister. The girl is not able to speak; she has difficulty communicating with the family, except with her mother. At the beginning of the experiment Rawan could only say three words: mom, dad and Bano (her sister’s name).

6 Teaching method

Let’s Play was designed to assist parents in the demanding task of making frequent repetitions of the same word. Moreover, we decided to limit the number of words repeated when the children encountered objects while using the application. The number of identified objects is also naturally limited by the number of available beacons. The rules that define the teaching method assumed in our experiments are the following (a sketch of the repetition logic is given after the list):

  • Instant pronunciation of a word and display of its image when an object is encountered; the application remains silent for a minute after the child touches the object’s image on the smart device, and after the one-minute break the object can be identified again;

  • Sessions of up to 30 min with the application; after that time the parent takes the smart device from the child;

  • Four sessions daily, separated by pauses of 2+ hours;

  • Up to 5 objects identified by beacons per week; objects are distributed over the entire house so that they are not detected by the application together, and the child needs to move in order to encounter another object;

  • The child is considered to have learned a new word when he or she reacts while hearing the word and knows how to pronounce it.
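The sketch below models the timing rules listed above (the one-minute silence after a touch and the 30-minute session limit); class and method names are assumptions for illustration only, and the judgement of whether a word has been learned remains with the parents.

```java
// Illustrative sketch of the repetition rules: one minute of silence per object
// after the child touches its image, and sessions capped at 30 minutes.
import java.util.HashMap;
import java.util.Map;

final class RepetitionPolicy {
    private static final long PAUSE_AFTER_TOUCH_MS = 60 * 1000;      // 1 minute
    private static final long SESSION_LENGTH_MS = 30 * 60 * 1000;    // 30 minutes

    private final long sessionStartMs;
    private final Map<String, Long> lastTouchMs = new HashMap<>();   // per beacon

    RepetitionPolicy(long sessionStartMs) {
        this.sessionStartMs = sessionStartMs;
    }

    /** The child touched the image for this object; keep it silent for a minute. */
    void onImageTouched(String beaconId, long nowMs) {
        lastTouchMs.put(beaconId, nowMs);
    }

    /** Should the application react to this beacon right now? */
    boolean mayRepeat(String beaconId, long nowMs) {
        if (nowMs - sessionStartMs > SESSION_LENGTH_MS) {
            return false; // session over: the parent takes the device back
        }
        Long touched = lastTouchMs.get(beaconId);
        return touched == null || nowMs - touched >= PAUSE_AFTER_TOUCH_MS;
    }
}
```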

In both case studies, the children felt comfortable because their families understood what they needed, and they started to communicate more with family members. The families’ engagement in the experiment was motivated by their common wish to enrich their children’s ability to communicate and to name basic objects that they needed to use.

6.1 Observation procedures and teaching method

The experiment lasted more than 12 weeks; the period of observation started on August 1, 2015 and was concluded on October 31, 2015. For the first six weeks, both children used the application with a smart watch. In the second six-week period, the smart watch was replaced with a smart phone. The activity time of the application was set to 30 min; this was the period during which the application detected beacons in proximity range, displayed the image and played the sound when a beacon was encountered. Parents were asked to run the application four times a day (but this was not strictly obeyed).

Seven days a week multiplied by four daily sessions gives a maximum of 28 runs of the application per week. Sometimes the children were bored and did not want to play with the application; in such cases the parents did not force their children to use Let’s Play. In the data collected during the 12 weeks of experiments, the number of sessions varied from 10 to 28 per week in Hussein’s case, while in Rawan’s case the number of 30-min sessions varied from 13 to 24 per week.

In both observed cases, the children and their families used the Arabic language. Hussein and Rawan learned words describing basic objects in their near neighbourhood, such as a feeder, shoes, a chair, and a refrigerator.

In an interview before the experiment, the parents estimated that the time required for their autistic children to learn a new word was about 3 weeks of systematic repetition. They could not say exactly how many times a day or per week they repeated particular words, and their teaching process was not focused on single words; rather, they tried to name many objects when their children played with or used them.

Using Let’s Play, the children had the opportunity to listen, in a systematic, repetitive manner, to up to five new words (each family got five beacons to use in the experiment). Moreover, every time the pronunciation of a word was played, its image appeared on the smart device and the child touched the screen to turn the image off.

6.2 Case study

In this section we present and discuss the results and observations collected during the experimental use of Let’s Play with the two children, Hussein and Rawan. Each of the children used the application on two smart devices – a smart watch and a smart phone. Although the interfaces were identical on both devices, each child played with Let’s Play on two different devices in order to observe whether changing the mobile device would have an influence on the child’s preference and willingness to play with the application. We also wanted to verify whether the different devices affected the speed of learning. However, our experiment, limited to the observation of two children, did not allow us to draw statistical conclusions.

During the experiment Hussein enriched his vocabulary. The process of learning new words was faster when compared to unsupported learning from listening to words repeated by his mother and father in everyday routines. It was clear that his motivation to communicate with the family improved through using smart devices with the Let’s Play application.

Hussein’s mother reported that the boy was increasingly communicating with her using the words he had learned from the application, while his father reported that Hussein liked playing at finding the beacons’ locations. By the end of the experimental period, the boy tried to start communication with other people using the new words.

Table 1 shows the number of new words that were learned by the first child (Hussein) while using Let’s Play on a smart watch during the first six weeks. The table also shows the number of uses of the application per week. The fourth column gives the learning rate per session, defined below:

Table 1 Hussein’s progress during six weeks of Let’s Play on smart watch
$$ \text{Learning rate per session} = \frac{\text{No. of learnt words}}{\text{No. of Let's Play sessions}} \times 100\,\% $$

The learning rate per session expresses the speed of learning. Although this factor may seem imprecise, because sessions are not equal (e.g. we do not know how many times a word was pronounced by the smart device in each session), it gives an estimate of how many sessions were required to learn a particular word. This measure was widely used by the parents in discussions and comparisons of the children’s performance.
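As a worked illustration (with purely illustrative numbers, not data from the experiment), a child who learns one new word in a week containing five sessions achieves

$$ \text{Learning rate per session} = \frac{1}{5} \times 100\,\% = 20\,\%, $$

i.e. on average five sessions were needed for that word.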

Table 2 presents the parallel data collected during the second phase of the experiment with Hussein, when he used a smart phone to operate Let’s Play.

Table 2 Hussein’s progress during six weeks of Let’s Play on smart phone

The data collected during the experiment with Hussein are also presented in charts (see Figs. 4 and 5). As can be seen in Fig. 4, the speed of acquiring new words by Hussein was almost linear and similar for both forms of interaction, with the smart watch and the smart phone. Figure 5 shows that, after the initial attraction, from the third week of using Let’s Play the boy became bored with the application and used it less frequently. This may suggest the need to regularly change the device and the form of interaction with the application to keep the child engaged and using the application as a form of entertainment.

Fig. 4 Cumulative number of new words acquired by Hussein using Let’s Play during 6 weeks of the experiment

Fig. 5 Number of 30-min sessions using the Let’s Play application by Hussein during 6 weeks of the experiment

Tables 3 and 4 present the data collected through observation of the second autistic child, Rawan, who also used the Let’s Play application installed on a smart watch and a smart phone.

Table 3 Rawan’s progress during six weeks of Let’s Play on smart watch
Table 4 Rawan’s progress during six weeks of Let’s Play on smart phone

Rawan’s progress and frequency of using Let’s Play are also presented in charts (see Figs. 6 and 7).

Fig. 6 Cumulative number of new words acquired by Rawan using Let’s Play during 6 weeks of the experiment

Fig. 7 Number of 30-min sessions using the Let’s Play application by Rawan during 6 weeks of the experiment

Rawan’s progress and speed of acquiring new words (Fig. 6) were slower than in the case of Hussein (Fig. 4). After the fourth week she did not learn a new word while using Let’s Play on the smart watch. This might be explained by the fact that Rawan is one year younger than Hussein (3.5 and 4.5 years of age, respectively). Such a difference in age matters when assessing maturity and the ability to learn, even in the case of children not affected by autism. It should be noted that the two children in this experiment were of different ages and possibly had different degrees of ASD. The number of sessions with the application per week was more stable in Rawan’s case compared to Hussein’s. The effect of becoming bored with the application after several weeks of playing with Let’s Play was not observed in the girl’s case.

The last element derived from the experiment was the speed of learning expressed by the so-called learning rate. The value of this parameter (the higher the better) shows what part of a word a child could remember after one session with Let’s Play, or what part of a word remains in memory after one session; e.g. a learning rate equal to 20 % (1/5) means that the child needed five 30-min sessions to learn a new word. Comparison of Figs. 8 and 9 gives some evidence of the repeatable character of the number of sessions required for a child to remember a new word. Moreover, one can observe that in the case of Rawan (Fig. 9), using Let’s Play on the smart watch in the last two weeks of the experiment gave no visible improvement in newly acquired words. This may be a hint to work on the proper selection of the device preferred by the child, or to allow interaction using various devices, in order to keep the child engaged with the application.

Fig. 8 Hussein’s learning rate: what part of a new word remains in memory after one session

Fig. 9 Rawan’s learning rate: what part of a new word remains in memory after one session

The lower number of new words learnt by Rawan when compared to Hussein’s performance may come from the fact that the girl was one year younger than the boy. Rawan’s parents observed that, while she reacted to some words trained in a particular week of the experiment, she did not pronounce them; thus, they were not counted as learned words.

After the experiment, Rawan’s mother reported that the girl started to communicate with her sister by using the words she had learned from the application. It was a wonderful experience for Rawan to feel that her family understood her requests and empathized with her.

Although it was not verified in our experiments, it may be expected that giving a child a break of some weeks from Let’s Play after several weeks of exercises might allow the child to forget the application. Returning to play after the break could engage the child more than was observed during these experiments, in which Rawan and Hussein used a smart watch and a smart phone without a longer gap between the phases. Such a rediscovery of joy in formerly used but hidden and forgotten toys is a well-known effect.

6.3 Parent’s observations and suggestions

Shortly after the experiment there was a meeting with the parents, who were asked to express and share their findings about functionality, usability and their overall satisfaction. Both couples expressed their satisfaction and a will to continue using Let’s Play to assist them in the repetitive routine of pronouncing new words several times each day. The parents pointed to several positive functions of Let’s Play and its teaching methodology:

  • Increased effectiveness of learning new words

  • Focus on a limited number of words in a week

  • More systematic repetitions than parents usually manage on their own

  • Parents were required to plan what they wanted to teach their child and to prepare a pronunciation and an image (this cost some work but gave a very successful outcome)

  • Children considered using the application as a form of game

  • Children started using newly acquired words to attract their parents’ attention and truly enjoyed it

The parents participating in the experiment suggested extending the application with a dictionary of words (pronunciations and images) in order to allow an easier return to previously trained words. They had no remarks on the length of a Let’s Play session (30 min), which seems appropriate, but they observed that from time to time the children were bored, and they then decided to reduce the number of sessions per day.

7 Summary and conclusion

This research work was conducted to propose and discuss a novel use of assistive technology for children with autism. The solution is based on an application installed on a smart device and Estimote Beacons used to identify objects. An autistic child may learn objects’ names and sounds by hearing the words pronounced by a parent. The sound is played on a smart device and a cartoon image is displayed repetitively whenever the smart device encounters and recognizes an object equipped with a beacon. The smart device connects to a beacon encountered in its neighborhood via a Bluetooth Low Energy link, which gives the advantage of tracking the distance between the smart device and the beacons. The Let’s Play application installed on Android devices assists the child and repeats the object’s name as many times as the child wants to play with the program. In this sense, the application helps parents perform this repetitive task; however, it is for the parent to decide on the duration and scope of the application’s use.

In recent years, a number of applications of assistive technology have been investigated in order to improve communication and enhance the language skills of autistic children, with the aim of addressing specific deficits; this encouraged the development of our new application. The proposed system is based on a relatively inexpensive, easy-to-install-and-use solution. It requires a smart device (for instance a smart phone or a smart watch) and a set of beacons to be used as object identifiers. The didactic approach used in Let’s Play assumes active participation of parents, who are expected to configure the environment by attaching beacons and recording the pronunciation of objects’ names. Let’s Play was validated with two families with autistic children. The experiments, although not extensive and performed over only a few weeks, gave a positive outcome. The children liked playing with the application and they learned new words. It is planned to give Let’s Play to more families and collect more information on the effectiveness and speed of learning new words and enriching autistic children’s vocabulary and communicative skills. The speed of new-word acquisition, estimated by the parents before the experiment at 1 word per three weeks, was strongly outperformed when using Let’s Play: systematic repetition of encountered objects’ names allowed the autistic children to learn from 0 to 2 new words per week in Rawan’s case and from 2 to 4 new words per week in Hussein’s case. One should take into account that the speed of new-word acquisition may vary with the child’s age and the degree of autism spectrum disorder. The parents also reported that the use of Let’s Play focused them and their children on a few particular words in every week of the experiment. Such a selection and frequent repetitions led to good retention.

The mental, emotional and physical needs of the child should be taken into account when the user chooses the object for learning. There is a need to choose the most important objects and activities that would benefit the autistic child.

One possible future application of Let’s Play assumes that the application could be installed on a smart TV already deployed inside a special room dedicated to the didactic purpose. The beacon could be located inside the clothes of the autistic child. When the child walks into the room, the receiver (the smart TV) would pick up the signals broadcast by the Bluetooth device inside the broadcaster (the beacon) within proximity range. The application installed on the smart TV could call the autistic child by name to get his attention and then provide new information. For example, the child’s mother could appear in a movie and call the autistic child by name, then repeat the words taught to the child during the day, possibly showing the objects or their images.

The application still needs improvement to get better results for autistic children. At the same time, the users considered it a good help in developing communicative skills and believed it could improve the quality of their lives.