1 Play and interactive play

We are going to look at Co-located Augmented Play-spaces (CAPs), or interactive play systems; we will use these terms interchangeably. These systems centre on providing forms of social and bodily play in a technologically enhanced space. In this manuscript we will focus more on room-sized spaces than on urban play, and on systems that target play for multiple players. With the rapid growth of technological possibilities we have seen a variety of new types of pervasive play-spaces. These environments are used to specifically target the cognitive, social-emotional, and/or motor skill (development) domains [1, 2]. We will give an up-to-date overview of this research field.

We are not the first to give an overview of CAP-like systems: previously Magerkurth et al. described various Pervasive Games [3], Sturm et al. described various Interactive Playgrounds [4], Nijholt et al. described various Ambient Intelligence Environments [5], Stach et al. classified different Active Games based on the input [6], Schouten et al. described various Ambient Games [7], Poppe et al. also described various Interactive Playgrounds [2], and Malinverni and Parés conducted a systematic review of Full-Body Interaction Learning Environments (FUBILEs) [8]. The authors and papers had different foci but all contained some examples of what we call interactive play. They also mentioned key issues for the design of and research into playgrounds. We have built on these works, extended and brought together related work, and borrowed parts of their lexicon.

The featured literature was collected during a research project on Ambient Entertainment that started in 2011. Google Scholar, ACM Digital Library, and Springer Link were used as primary search environments. Google, Vimeo, and YouTube were used as well, to also familiarize ourselves with non-scientific work. We contacted and communicated with several companies working in this field to broaden this knowledge. Search terms included, but were not limited to: interactive playgrounds, interactive play, ambient entertainment, and embodied interaction. Several students were assigned to perform additional searches on related topics, which provided us with a broader view of the field and also pointed us to relevant research. We did specific searches into questionnaires, recurring authors, and research groups, and we performed directed snowball sampling, that is, we looked into referenced work, filtering on title, familiarity, and citations. This resulted in a collection of 435 research papers, 5 books, 4 Ph.D. theses, 4 technical reports, and several movies, leaflets, and websites. The literature included in this survey was selected based on a mix of its fit to the themes, the structure of this survey, and the recurrence of citations. During the review process we removed one and added a further 18 research papers to emphasize and elaborate on certain aspects.

This survey is structured as follows. We will finish the introduction of this survey by elaborating on play and interactive play. We will then have four sections dealing with both the end user's perspective and a researcher's perspective. We start by discussing several end user goals that have been targeted with the introduction of these systems (Sect. 2). This will be followed by an overview of CAPs, their intended use (strongly related to an end user's perspective), and an indication of their physical form (Sect. 3). We will then turn towards a researcher's perspective, discussing several ways in which evaluation of these systems has been performed (Sect. 4). The next section also takes the researcher's perspective: it categorizes the types of research contributions that resulted from designing and investigating these systems (Sect. 5).

In this survey we did not focus on the idea creation phase of design. We welcome future work on this topic, but it was outside the scope of this paper, for two reasons. The first reason is pragmatic: it was not the focus of our recent research efforts in the domain. The second reason is, besides pragmatic, perhaps more provocative: from the research perspective, we have seen that the idea creation phase can be omitted, as quite a number of the cited papers investigate existing systems. Nonetheless, this paper does provide an overview of research and systems that can be informative in the (idea creation) design phase.

We will finish the manuscript with a section explaining what we see as a promising direction for future research in this field: an intervention-based play research approach, which we think could better bring together these different aspects of interactive play (Sect. 6).

1.1 Play

In this survey we refer to play as a social, bodily activity that people (partially or primarily) engage in for fun and entertainment. Play in that sense has been researched for decades. Best known are the early works by Roger Caillois and Johan Huizinga, based on analysis of (human) cultures, language, and practices. Both authors explain that there are many different types of play, including but not limited to goal-oriented outcome games, cultural performances, and games that simply stimulate the senses [9, 10]. Both view play as being omnipresent in our nature and culture. The developmental psychologists Lev Vygotsky and Jean Piaget both referred to play as an important element in the way children develop, although the two have different views/theories on (the stages in) children's development [11, 12]. Iona and Peter Opie also did essential work in researching play in the second half of the 20th century, archiving, collecting, recording, and analysing children's play and tradition in the UK. We refer the interested reader to [13], in which the Opies' work is compared to current-day play in the UK. Recently Jaakko Stenros wrote a thesis on the spectrum of playfulness, play, and games, with an elaborate review of definitions and positions of these and other authors [14]. Based on this work, and from our focus and point of view, we see play as ranging from structured play with fixed, rule-based games to open-ended play, which is more frivolous, imaginative, and non-deterministic. Both ends of the spectrum have their benefits and downsides with regard to the effects play can have outside the activity itself, for example, stimulating creativity, improving cognitive development, learning social skills, or (better) enhancing physical skills.

1.2 Interactive play

Interactive play allows for enhanced play experiences by combining traditional play with advances in technology [1, 15]. We think that true interactivity is more than simply turning a product on or off and instead requires a dialogue of actions and reactions [16,17,18]. Interactive play is more than electronic toys such as remote-controlled objects (drones, cars, and balls), light sabers, and walkie-talkies. Although such electronic toys also combine technology with playful activities, we see them as inherently different from interactive play systems. Looking at the field of interactive play, we see four elements that together separate interactive play from this type of electronic toy. First and foremost, all systems that we include in our definition explicitly require body movement for interaction, creating an embodied interaction that is different from the interaction required by computer games played with a joystick, mouse, or touchscreen [19, 20]. The systems respond to this movement-based type of input. Second, the feedback is enhanced, going beyond the mere physical impact of the movement. The timing of the feedback is 'direct', that is, not delayed until after the entire interaction, and the feedback is offered in gradual forms, for example, lights/visuals in different colours, a variety of sounds, and movement/vibrations in various intensities [17, 18]. Third, there is some history of state: for example, the system remembers where a player was standing a few seconds ago in order to switch between states or to keep a score [21]. Fourth, and optionally, depending on the type of device and the goals, systems can be made more interactive by sending and comparing the states of multiple devices/players (between devices), which provides more opportunities for play with multiple players, for example, turning on the lights around another goal once a player has passed a defender and has scoredFootnote 1 [23].
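To make these four elements concrete, below is a minimal sketch in Python of a single device loop; all names and values are hypothetical and purely illustrative, not taken from any cited system. It reads embodied movement input, responds immediately with graded feedback, keeps a short history of state, and optionally shares that state with other devices.

```python
import random
from collections import deque


class Network:
    """Stub for sharing state between devices (element four); hypothetical and illustrative only."""

    def broadcast(self, state):
        print("shared state:", state)


class PlayDevice:
    """Minimal sketch of one interactive play device; all names are made up for illustration."""

    def __init__(self, network):
        self.network = network           # element four: compare states between devices/players
        self.history = deque(maxlen=50)  # element three: a short history of state

    def read_motion(self):
        # element one: embodied input, e.g. a normalised accelerometer magnitude;
        # simulated here with a random value instead of a real sensor
        return random.random()

    def set_feedback(self, intensity):
        # element two: direct, graded feedback (e.g. LED brightness or vibration strength)
        print(f"feedback intensity: {intensity:.2f}")

    def step(self):
        motion = self.read_motion()
        self.history.append(motion)
        self.set_feedback(motion)                        # immediate, gradual feedback
        self.network.broadcast(list(self.history)[-3:])  # optional networked state


if __name__ == "__main__":
    device = PlayDevice(Network())
    for _ in range(5):
        device.step()
```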

Besides promoting interactions and providing pleasing forms of feedback, interactive play systems can sense, detect, and observe behaviour of the user. This allows us to intervene during play and to adapt the game based on the players’ interaction and performance [2, 24,25,26].

2 Argumentation for interactive play

Now that we have introduced the elements of interactive play that were derived from the literature, we will further explain goals that are targeted with interactive play as found in the included papers. Systems often target several of the following goals simultaneously. The goals can be linked to an end-user perspective, answering questions such as: What positive effects can the system have for the end-user? Why do we as a field work on this topic? Later, in Sect. 5, we will focus on what the contribution can be from a research perspective, describing several kinds of contributions that studies and papers have added to the body of knowledge. The set of goals from an end-user perspective is similar to that mentioned by Poppe et al. [2]. We have revised it to mention stimulating (distributed) social interactions and (sport) skill development in a more prominent way. We have excluded ‘behaviour change’ as we view this as a means to promote goals, not an end in itself. We also omit diagnosis, as we have not yet seen playful interactive systems doing this, although we agree that this might form a new and promising direction for CAPs with their multimodal characteristics and we are currently starting first explorations in that direction.

2.1 Stimulate physically active behaviour and sport skills

Children are used to playing with digital entertainment, which also leads to children spending more time with digital games [3].Footnote 2 There is an overall trend that has caused people on average to adopt a more sedentary lifestyleFootnote 3 [27, 28]. Introducing technology to make movement-based playful activities more appealing could help to (partially) counter this trend [29], as it seems to be a promising way to encourage children [24, 30, 31], teenagers [32], adults [33], and elderly people [34] to move more, at least on a short-term basis [35]. The work of Marshall & Linehan offers a few warnings for providing a transparent argumentation related to physical activity: do not overestimate the (long-term) effects of exertion games, recognise the importance of food intake when considering weight loss, and recognise that discouraging certain health-related behaviours can go against what users actually want [36]. Marshall & Linehan also point out that researchers active in the HCI domain should be careful in interpreting the literature from other research fields. They advise against the use of the obesity epidemic as a rationale for promoting exertion games, and instead mention that exertion games (and trying to stimulate physical behaviour) can have other benefits.

A second type of stimulation of physically active behaviour focuses on physical skill development. In Japan it has been shown that some types of physical ability have been declining in recent decades as well [27]. This skill development can be stimulated by simulating sport elements, adding motivation with game elements, incorporating ways for improved reflection on performance, and quantifying player progression [30, 37,38,39,40]. A goal of interactive play systems can also be to create a motivating activity in the rehabilitation process, where the systems help players to (re)gain skills that have declined through health problems [41, 42].

2.2 Stimulate social interactions

Compared to traditional play, digital entertainment might lead to fewer social interactions; more children are interacting through and with their technology (e.g. mobile phones), being together but alone: 'Alone together' [43]. Turning technology from a problem into a solution, well-designed interactive play could instead increase social interactions, directly by giving players different roles [26, 44], or by starting discussions about games, sharing interpretations of interactive elements, and stimulating negotiations regarding resources or rules [1, 15, 45, 46].

A subclass of stimulating social interactions consists of stimulating social interactions between people who are geographically separated. Often this is combined with exertion interfaces, 'an interface that deliberately requires intense physical effort' [47, p1], often based on sports. Combined with the distribution this becomes Sports over a Distance, a category of systems that attempt to break away from the social isolation and sedentary behaviour that seem to be supported by traditional digital games [48]. Such systems include technological ways to provide augmented sports, such as joint jogging [49, 50], kicking/throwing a ball against a wall [51], (kick)boxing [52], and table tennis [48]. Some systems also provide ways for haptic feedback, such as a game of tug of war [53] or arm wrestling.Footnote 4 This is primarily a different goal from the previously mentioned stimulation of actual sports movement, as it uses sports to get people to interact socially over a distance, instead of focusing on training certain abilities. Nonetheless, it is important to realize that many pervasive play-spaces often target several of these goals simultaneously.

2.3 Improve (children’s) cognitive development

Play is important for the development of children in the physical, social-emotional, and cognitive domains [54, 55]. By interacting with other children, they train negotiation and social skills. Cognitive skills are often trained by creating and adapting game rules, scenarios, and characters [54,55,56]. It seems that introducing technology into traditional play could also aid in children's development. Various design strategies for creating interactive play systems fit quite well with current psychological models about learning [8]. Some installations explicitly build on these models to create interactive playgrounds that explain mathematical notions such as bar charts [57] or algorithms [58, 59]. The installations can also be applied to explaining other educational topics such as geometry, physics, geography, music concepts, and language, or to understanding more moral topics such as environmental issues, cultural diversity, and social justice [8, 60]. Furthermore, they can be used to show the relations between educational elements, for instance showing that science is a network of knowledge [61]. A variety of interactive play systems also try to stimulate creativity. A well-known approach is open-ended play or emergent games, in which interactive elements provide an emergent space in which players are stimulated to create their own goals, games, and adapted rules, instead of games strictly prescribing how they should be played through their rules [1, 5]. This approach is related to open-ended interactive art works, which are not completely defined by an author/artist but rely on the interpretation of the reader/visitor [62].

2.4 Provide joyful experiences

A fourth reason that is mentioned is hedonistic: a focus on applying interactive play in order to provide a (new) fun experience, perhaps (indirectly) improving well-being with positive effects on the general health of the players, or simply for commercial reasons [5].

Beautiful music, splendid landscapes, mesmerising scents, and pleasing fluffy materials are all well-known ways to provide such an experience. Food intake is another important way to provide such a joyful experience. Although currently uncommon, edible interactions can be used to augment interactive systems [63] and vice versa [64], and food intake has proved to be an interesting stimulus for investigating multimodal hedonic experiences from a neurological point of view [65].

Fig. 1 An overview of categories of (Co-located) Augmented Play-spaces and electronic toys, with a focus on the three types of interactive play-spaces: (1) interactive toys, transportable devices with included sensors, (2) interactive environments, larger environments equipped with various sensors, and (3) geo-location devices, often mobile phones, with which games are played that are not constrained to a space or to co-location and can be played asynchronously. Some examples of games/systems are close to another (sub-)category; we placed those systems close to the borders. For this categorization we included the so-called head-up games in the playground props category. We used Adobe Photoshop CC 2015 to create this figure

3 Types of interactive play systems

A variety of interactive play systems have been developed in the last two decades.Footnote 5 Other papers categorize such systems based on the type of input, including physical characteristics (e.g. type of action or controllers and (physiological) sensors) [6, 7],Footnote 6 game genre (e.g. affective computing) [3], goals and hardware capabilities [2], or devices, scale, and interaction [5]. We have organised the systems according to their physical characteristics, similar to Sturm et al. [4], but we extend the description of the categories and include more (recent) systems.

In addition to this categorization, at the end of this section we include an overview of which output modalities are used in the referenced papers. This informs designers about what is currently being used and which new modalities might be explored.

Roughly, we see two main lines in research on interactive play that fit our focus: interactive toys, where objects are augmented with interactive elements, and interactive environments, in which the surrounding playground is also equipped with additional sensors or additional means of providing feedback. This split is not a dichotomy but a somewhat blurry distinction, where some interactive toys might rely on sensors in the environment and some toys can be introduced into interactive environments. In general the toys allow for more mobility of the installation and can be cheaper; the environments often seem to be more expensive but could allow for easier stepping in and out of the game [68, 69] or a 'show up [..] and play' approach [48, p3] in public spaces. Besides these two main lines there is the topic of geo-location games, which we only touch upon. This topic is quite different because, unlike other Co-located Augmented Play-spaces (CAPs), it less often requires co-located social play. For a quick overview with three described examples per category see Fig. 1.

We exclude certain systems and focus less on certain topics, even if they are interesting, because they do not fit the core of this paper. We only include a few interactive art installations and interactive play systems intended for museums. The body of work on these is much larger than represented in this survey. Some do not fit the core of this paper because of the lack of gradual input and feedback; others focus on providing a message instead of providing active embodied play. We also only include a few active video games: 'this form of game integrates the entertainment of playing games with the physical interaction of the user to control the game play' [70, p21]. This term is used mainly in health-related domains [71], and the games are (often variations on) movement-based console games for existing systems such as the Wii, Xbox Kinect, and PlayStation Move. These games only require movements to a limited extent in a small physical space. Still, they share much of what is discussed in this survey so far, and they can provide relevant results when incorporated in studies [19, 70], which is why several papers are included in Sects. 4 and 5. We have excluded most (interactive) fitness equipment, as this is also not made for co-located social interactions with various types of bodily interaction. We have also excluded interactive pedometer systems and physical activity apps such as Strava and Runkeeper, and related game-like research attempts that include persuasive elements (e.g. [72]). Some of these activity trackers might largely adhere to our description of interactive play systems, and future systems might even fall within the domain. Nonetheless, they form a quite different area of research, as many omit continuous 'direct' feedback and only provide 'feedback' before or afterwards on request [50],Footnote 7 often providing a one-on-one performance representation afterwards based on a more limited variation of actions to be taken, and they are less playful.

Another type of installation that is not included, only due to a lack of existing work we could currently find, revolves around augmented physical Escape-the-Room games. However, this would fit perfectly with the type of systems in this manuscript. Real-life escape rooms seem to offer an opportunity to address the earlier mentioned goals in a playful manner; especially addressing more specific topics such as team cognition, awareness, and verbal and non-verbal communication seems promising [73]. These rooms have also seen an apparent rapid growth in popularity.Footnote 8 However, currently many real-life escape rooms only incorporate basic levels of technology [73]. We know only of Pan et al. and Shakeri et al., who focus on actively incorporating technology beyond this basic level [73, 74]. Depending on the implementation, future systems like these could be seen as a separate form of interactive environment: mainly fixed interactive objects (the rooms) that can be combined with interactive screens as well as playground props.

3.1 Interactive toys

There is a variety of interactive toys, objects that can be carried and which are enhanced with interactive elements. Due to the differences between them there is also a variety of terms to describe them. We will use the following set to categorize the interactive toys: tabletop tangible interactive toys [3], handheld playground props [1, 4], wearables [22], and semi-portable playground props [21, 31].

3.1.1 Tabletop tangible interactive toys

Various commercial toys have been created that in one way or another can sense their own state, can be interacted with directly, or are coupled to a computer [3]. What we view as tabletop tangible interactive toys (including several types of smart toys) are often restricted to interaction on a table or on a small platform. Magerkurth et al. mention various (commercial) smart toys [3]. We will describe one such system as an example of these kinds of toys: the Zowie toy has the form of a pirate ship or an enhanced garden and senses the rotation and presence of objects that are linked to interaction on a computer screen. Recently, the combination of games that make use of detected physical objects got a boost with the introduction of Lego Dimensions and Skylanders.Footnote 9 For these games Lego also makes use of popular movies/'brands' such as the Simpsons, building upon existing fantasy worlds and introducing these to other types of media, a powerful strategy described as 'transmedia worlds' by Henry Jenkins [75]. There are many other tabletop toys and systems, often making use of RFID technology [76,77,78].

There is a variety of commercially available smart building blocks that children can assemble and that are actuated, such as ATOMS, Lego Mindstorms, Makeblock, Cubelets and Moss.Footnote 10 These (robotic) smart block systems seem mainly to focus on the cognitive domain (sometimes dexterity) but less on the other goals we mentioned in the previous section.

There are also affective dolls [3, 79], dolls with screens [80], and commercial dolls such as Furby or Baby Born, which are on the edge of what we earlier described as electronic toys. Furthermore, there are team-based tabletop games with tracked objects [81] or even objects providing haptic feedback [53].

3.1.2 Playground props

Playground props as we view them are similar to tabletop tangibles (and smart toys) but are meant to be used in a larger play-space as part of a room-sized game (or larger). They are often handheld devices with technology embedded for recognition and feedback. For instance, Bekker et al. developed LedBall, a device that can be held in a child's hand and that responds to movement by emitting different colours of light when it is shaken or rolled [1]. This was later called LedTube and resulted in several follow-up concepts.

Similar to such systems there are also interactive bats [82, 83] and interactive art props [62]. Furthermore, other playful objects for children with Profound Intellectual and Multiple Disabilities (PIMD) were created (including a button, a pillow, and a hugbag) [17].

The toy companies (e.g. Hasbro, Mattel, Toys“R”us) also sell commercially available interactive toys which are handheld and do not remain on the table, including an interactive ball [84],Footnote 11 and party toys with sequential instructions and sounds.Footnote 12

Commercial platforms such as the Wii make use of accelerometers, infrared, Bluetooth, vibration motors, a speaker, and LEDs in their handheld device to trigger whole-body movement. A variety of games have been created for such a platform, including many music-related games such as Rock Band, Donkey Konga, and Guitar Hero, and sports-related games that rely on arm movements such as boxing, bowling, tennis, yoga, and many more.

Soute and Markopoulos introduced the term Head Up Games (HUG) as a sub-category of playground props where players do not need to focus on and turn their head towards devices/mobile screens during an outdoor play activity, which in turn should have positive effects on the social interactions [85, 86]. For instance, Save the safe is a game played with a belt with a few LEDs and a vibration motor, where one player has a virtual key that is automatically passed when another player comes close; the burglars need to open a safe with the key in order to win [87]. Several other HUGs with accompanying handheld devices have been created, where players tag, shoot, collect, or hide someone/something [88,89,90,91]. Others have made use of the LEDs and accelerometer of the Sony Move controller. Johann Sebastian Joust is a game where the Sony Move controller has to be held still within a certain threshold (depending on the tempo of the music playing); players are triggered to physically try to push or unbalance ('joust') the other players and be the last one standing. A similar game is Idiots attack the top noodle, where a mobile EEG device is added to influence this threshold of allowed movement. Jelly Stomp is a game where players have to submerge another Move controller under water.Footnote 13 Several researchers have also created interesting games with these Move controllers [92, 93].
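As an illustration of such a movement-threshold mechanic, here is a minimal sketch in Python; it is not the actual Johann Sebastian Joust implementation, and all names and values are hypothetical. A player is eliminated when the deviation of their controller's accelerometer reading from rest exceeds a threshold that scales with the music tempo.

```python
import math
import random


def movement_magnitude(ax, ay, az):
    """Deviation of the accelerometer reading from 1 g; a controller held still gives roughly 0."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)


def allowed_threshold(music_tempo_bpm, slow=60.0, fast=180.0):
    """Faster music allows more movement; the linear mapping is chosen only for illustration."""
    t = (music_tempo_bpm - slow) / (fast - slow)
    return 0.05 + 0.45 * max(0.0, min(1.0, t))


players_alive = {"p1", "p2", "p3"}
tempo = 90.0  # current music tempo in beats per minute

while len(players_alive) > 1:
    for player in list(players_alive):
        # in a real system these values would come from the player's controller;
        # here they are simulated as noisy readings around 1 g
        ax, ay, az = random.gauss(0, 0.2), random.gauss(0, 0.2), random.gauss(1, 0.2)
        too_much = movement_magnitude(ax, ay, az) > allowed_threshold(tempo)
        if too_much and len(players_alive) > 1:
            players_alive.discard(player)  # 'jousted': moved too much for the current tempo
            print(player, "is out")

print("winner:", players_alive.pop())
```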

3.1.3 Wearables

Interactive wearables can also be used as playground props. For example, Bekker and Eggen, as well as Rosales, proposed an idea for an interactive glove [30, 94]. The glove sends and receives an infrared signal as if passing a ball around between players, allowing other players to block or intercept it; a similar glove or wearable display could also be used to play new forms of the game of tag [95, 96]. Rosales et al. created several technologically enhanced wearable systems with which children could play by jumping, freezing, and dancing, using shoes, fanny packs, and wearable sound kits [94, 97, 98].

In Jogging over a Distance players wear a headset, and either a waist pouch with a mini-computer and a GPS device [99], or a mobile phone and a heart rate monitor [50], to provide a social joint jogging experience over a distance.

The commercially available game of laser tag could also be partially included in this category, although the guns have to be held in the players’ hands. Recently (2015) Mattel started selling Marvel PlaymationFootnote 14 a mixed-reality wearable toy (an Iron Man glove), where physical movements influence virtual elements and in turn virtual elements influence physical elements.Footnote 15

3.1.4 Semi-portable playground props

Another type of playground prop does not need to be carried around; instead these props are placed somewhere in the play-space. For instance, De Graaf et al. created the now commercially available SmartGoalsFootnote 16 [23, 84]. Each goal consists of two small traffic cones that light up when they are in their ON state, and only during this lit phase do they allow scoring with a ball. The scoring is sensed automatically, and the sudden change of the active target could make the training more dynamic. The Swinx is a commercial device that is also placed on the ground, with which players interact by placing wearable RFID tags on it. Several researchers used the device to investigate aspects of play including physical activity, collaborative play, and changing game rules [1, 100, 101].Footnote 17
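A minimal sketch of this kind of mechanic is given below in Python; it is not the actual SmartGoals firmware, and the goal names, phase length, and sensing are hypothetical. One goal is randomly switched ON for a short period, and only scoring events detected during that lit phase are counted.

```python
import random
import time


class Goal:
    def __init__(self, name):
        self.name = name
        self.on = False

    def set_on(self, on):
        self.on = on
        # in a real device this would switch the cones' LEDs on or off
        print(self.name, "ON" if on else "off")

    def ball_detected(self):
        # stand-in for the goal's ball sensor; simulated here with a random outcome
        return random.random() < 0.3


goals = [Goal("goal A"), Goal("goal B"), Goal("goal C")]
score = 0

for _ in range(5):                   # five lit phases
    active = random.choice(goals)    # the sudden change of target keeps the exercise dynamic
    active.set_on(True)
    phase_end = time.time() + 2.0    # each lit phase lasts two seconds
    while time.time() < phase_end:
        if active.ball_detected():   # scoring only counts while this goal is lit
            score += 1
            print("score!", score)
            break
        time.sleep(0.1)
    active.set_on(False)

print("total score:", score)
```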

Seitinger et al. created an interactive pathway that was also easily transportable, containing a ladder/rail-track of pressure sensitive pads that each triggered a motor at the side, which in turn made spinners rotate [31]. Even this simple system triggered different kinds of play (fantasy, active, exploration and game building) especially after the spinners were personalised by the children themselves. Various other playgrounds and systems use interactive pressure pads. Lund et al. created one of the first with their modular Playware that included some networking and several LEDs [29]. It was later improved and used for soccer, rehabilitation, and more [34, 102, 103]. De Valk et al. created FlowSteps (later GlowSteps), consisting of a set of even more mobile and battery-powered mats/pads, with different coloured LEDs that are capable of communicating with each other [21, 104]. These systems all provide fun interactions in which players can stomp, jump, and step.

A commercial example of pressure sensitive pads is Nyoyn’s Sound tiles.Footnote 18 Several other pressure sensitive (and portable) pads only function as a means of input but do not include any form of output or have to be combined with VR or other systems.Footnote 19

3.2 Interactive environments

We now turn to the second main line of systems: interactive environments. These are systems that embed sensors in the environment. Sensors may be put into fixed objects, a floor, or a wall, or an entire room may be equipped with sensors. The systems fitting these physical characteristics mainly seem to come in two types: fixed interactive objects and interactive screen environments.

3.2.1 Fixed interactive objects

Many examples of fixed interactive objects come from commercially available playground equipment, see Figs. 2, 3 and 4. Kompan is a company that makes such (interactive) playground equipment, often with a central control station and several flashing game nodes.Footnote 20

A second company that makes interactive playground equipment is Yalp.Footnote 21 Their systems vary quite a bit but include an interactive audio arch, a set of interactive touchscreen poles, and an interactive (soccer) wall.

A third company making interactive playgrounds is Lappset (of which Yalp is a subsidiary). Their GameNetic consists of a terminal that has to be electrically charged using a pedal.Footnote 22 Their SmartUS system was one of the first commercial interactive playgrounds and made use of pressure sensitive tiles, RFID cards and sensors, and several posts with buttons. It also had a control unit for game selection, high scores, and instructions.Footnote 23 It was developed in collaboration with researchers of the University of Lapland's Faculty of Education, Lappset Group Ltd, and IT companies (personal communication 16-3-2017).

Fig. 2 Commercial playground equipment. On the left, the Kompan Swirl; the bright red and blue objects represent the nodes (image from Kompan, fair use). On the right, the Yalp Memo with touch-sensitive LED rings (used with permission)

Fig. 3 More commercial playground equipment. On the left, the Lappset SmartUs, with the tiles, the poles, and the control unit (photo courtesy of Lappset Group Ltd/Antti Kurola). On the right, a Playtop Street with their design, layout, and surfacing, with a control unit and the LED-emitting satellites placed in the ground (still from Playtop, used with permission)

Fig. 4 Even more commercial playground equipment. On the left, the Playdale i·Play, with activity switches that need to be pulled, pushed, or turned (image from Playdale, fair use). On the right, the Playworld Systems NEOS 360 with the central unit and several buttons in an arena setting (photo used with permission)

A fourth company that makes interactive playground equipment is PlayAlive.Footnote 24 Their systems consist of so-called satellites and a control station. Each satellite functions primarily as a button and has a circle of LEDs, somewhat similar to the Kompan Icon button explained earlier. In their PlayAlive Spider the satellites are used to create an interactive climbing frame. Their e-wall solution embeds the satellites into a wall and is intended for educational purposes.Footnote 25 Their satellites are also sold separately, so that others can embed them in their playgrounds.Footnote 26 For instance, satellites can be embedded in the ground, changing the action to stomping instead of pressing,Footnote 27 see Fig. 3. Furthermore, Karoff et al. used them to create an interactive trampoline [107].

A fifth company that makes interactive playground equipment is Playdale.Footnote 28 They created i·Play, consisting of an arch-like structure, see Fig. 4. It has activity switches: buttons, handles, and knobs that include LEDs and speakers.

A sixth company is Playworld® Systems, which created NEOS® (360).Footnote 29 NEOS consists of a central unit where games can be selected and that shows a high score, combined with several poles with large buttons that have to be hit/pressed. The system also plays background music, makes sounds, and is able to emit different coloured lights.

Several research papers also mention fixed interactive objects. The Flash poles concept consists of several poles with three coloured rings that can be pushed/turned to change their colour [4]. Ludvigsen et al. created similar poles for training handball [40]. Other systems used a bouncing frame/goal for training handball [39] or soccer [38].Footnote 30 Parés et al. created an interactive water installation [108, 109]. In this installation players had to form a ring of people and then move around a central fountain to let water jet into the air in predefined sequences. Back et al. created interactive playground landscapes (including a tube and a communication node). Both fixed and mobile prototypes were presented, but the authors also aim for integration in a specific place [110, 111].

Marshall et al. created Breathless, an interactive swing ride augmenting the awareness of breathing by incorporating it as the control mechanism for swinging, through the use of a gas mask coupled to the motorised swing [112]. Grønbæk et al. created SwingScape, a set of permanently installed outdoor swings that control sonic feedback, augmented with changing lights [113].

Rogers et al. created Hunting for the Snark, an experience where children have to explore and interact with an augmented environment to get to know more about a fictional character, 'the Snark' [114]. Children used PDAs to search for objects (representing food), placed RFID-equipped objects, stepped on pressure sensitive tiles, and flapped their arms in a wearable with gesture recognition in order 'to fly' on a large projection.

Liljedahl et al. created DigiWall, an interactive climbing wall [115]. It consists of climbing holds equipped with touch sensitive sensors and LEDs, in combination with a surround sound system. Several games were created for it. Ouchi et al. and Oono et al. also created an interactive climbing wall with similar holds. Their research focused more on modelling the climbing behaviour of the children to inform future designs [116, 117]. Kajastila et al., instead of using interactive holds, used computer vision and projections for their Augmented Climbing Wall, which they see as part of the larger category of Augmented Feedback (AFB) systems [118, 119]. Wiehr et al. aimed to create a similar but easier to set up system called betaCube [120].Footnote 31

Furthermore, there is a variety of interactive fitness equipment such as adapted home-trainers or treadmills. Because of their adaptations they allow for gamification, or for playing certain scenarios (e.g. riding through a city or up a hill). Both kinds of systems are commercially availableFootnote 32 and/or designed in research settings [49]. We will leave further description out of our overview, as they often respond only to intensity and not to different types of input/body movements. We do, however, want to mention Heart-Burn as an example of an interesting active game, in which people compete by cycling and in which adaptive elements based on both effort (heart rate) and actual performance are used to balance the game, in order to improve the experience [121].

3.2.2 Interactive screen environments

Bobick et al. created KidsRoom, the first interactive play system especially tailored for immersing several children in a narrative without them needing to wear any specific hardware [67]. It consists of a room where children are immersed in a linear narrative that progresses depending on the players' actions and their pacing. It has several still-frame back-projected walls (not intended as the centre of the participants' attention), computer-controlled theatrical lighting, and four directionally controlled speakers that play music, sound effects, and recorded voice narration. It contains several different worlds: a bedroom, a forest, a river, and a monster world. Each world includes its own projections on the wall and requires specific actions to let the story progress; this includes recognition of positions, posture, and movement. The system intelligently exploits and controls the context of a narrative; it requires children to do actions such as shout a magic word, follow the path, walk to a chest, gather on the bed, row a boat (on the bed), and do a dance with a monster.

Another well-known interactive screen environment is PingPongPlus by Ishii et al. [122]. It consists of a projection on a table tennis surface that responds to the position of a table tennis ball. Ishii et al. created several types of projection modes and games. Altimira et al. recently created a similar projection-based version for table tennis to investigate balancing a game by inducing an aggressive or defensive player style [123].

Mast and de Vries created a version of cooperative Tetris played on a large screen, where players had to work together to move the blocks [124]. They compared a version where players had to jump (wearing a fanny pack with an accelerometer) to one where players had to press a button.Footnote 33 One player could move a block to the right, another could move it to the left, and an action of both players simultaneously would rotate it.

The Entertaining Archery Experience [37] is similar to a playground props system. It consists of a fairly realistic adapted bow and arrow, adapted with electronics (Arduino with reed switches/sensors, IR-laser and Kinect) and a pneumatic damping system, which has to be aimed at targets on a large screen in the context of a game.

Soler-Adillon and Parés created a large Interactive Slide with an interactive projection on it, where children play games by sliding down over it [24, 125]. Parés et al. created MEDIATE, a large room with two large projection walls and 9 cameras to track behaviour/attitude of the players [126]. The target group was children on the autism spectrum, low functioning and without verbal communication. Watson and Gobeille created Funky Forest, an interactive virtual ecosystem, including floor and wall projections, intended mainly for children, see Fig. 5.Footnote 34

Fig. 5 On the left, Funky Forest, an interactive ecosystem created by Theodore Watson and Emily Gobeille; photo courtesy of Design I/O. You can see one person redirecting the water while others are creating trees. On the right, Looking for Life by Snibbe Interactive, an interactive installation representing evolutionary theory. You can see two players using gestures to influence and create cells that evolve over time; still used under fair use with permission by Snibbe

Kick Ass Kung-Fu is an interactive martial arts game by Hämäläinen et al. [127]. It is played on a cushioned playfield with two or more large screen(s) at the end, and the movements are tracked in this 5x1 meter area with computer vision techniques.

Mueller et al. created several (distributed) exertion games. They created Remote Impact, where players kick and box against the 'shadow' of a distributed opponent projected on a large mattress-like foam [52]. This foam is held in place with elastic bands that guide the forces, which are used to measure where impact takes place. Other systems include break-out-for-two [47], three-way table tennis [48], and airhockey-over-distance [128]. All consist of a non-interactive floor or table surface with a videoconferencing implementation projected on an interactive vertical wall. In the first two games, virtual areas have to be hit several times (or very hard) before they break; the last hit is rewarded with points, and the ball bounces back into the physical world. In airhockey, instead, the players have to hit (and defend) the goal; the puck is caught and, using rotating cannons, shot in a similar direction at the other location.

Laakso and Laakso created body-driven multi-player games where orientation and players’ group dynamics (e.g. forming a circle) were detected with computer vision [129]. The games were shown on a large wall display accompanied by audio effects, and were interacted with by position in the space and arm gestures in a (forward) horizontal plane. Toprak et al. also created an interactive wall game where two players compete to touch bubbles on a wall [130]. Morrison et al. describe a form of an interactive wall from the domain of interactive art-works: Space of Two Categories by Hanna Haaslahti,Footnote 35 an interactive shadow where an animation of a small girl is projected moving around in the players’ shadow(s) [62, 131].

QuiQui’s Giant Bounce was an early whole-body computer game that made use of both voice input and a web-cam, to recognize children’s movement and actions [132].

ActiveCurtain, created by Larsen et al. for people with PIMD, is an elastic interactive screen that can respond to touch but is different from normal touch screens [133]. Using the Kinect's depth sensor combined with projections behind an elastic screen, it can trigger interactions with a different form of bodily engagement. One might use one's head or reach into the screen; by responding to such gross body movements and by providing a form of tangible interaction, the system seems to be more suitable for people with profound intellectual and multiple disabilities. TouchMeDare by van Boerdonk et al. is an elastic touch-sensitive opaque canvas that aims to explicitly elicit bodily interaction between people as a means to get to know each other [134]. It is different from all the other interactive environment play systems in that the screen provides no visual feedback but is only aimed at collaborative music making.

Interactive floors. Interactive floors have a horizontal interaction area and often have to deal with players obscuring an image/projection for themselves or others. In contrast, the space of movement in front of a screen or wall is often more limited, and mapping movements to the screen can lead to confusion [127].

Several interactive floor systems exist for indoor purposes, with mainly LEDs or projections as means of feedback, using RFID [135], pressure sensors [42, 136, 137], a laser scanner [58], Doppler radars [136], and/or computer vision to track people [60, 138, 139]. Several target groups have participated in studies with these floors, including children [140], families [135], students [26, 139, 141], intellectually disabled people [137], rehabilitants [42], and hearing impaired people [138].

Snibbe et al. created several interactive camera-projector systems [142]. Boundary Functions draws lines between players on the floor, creating a Voronoi diagram. Deep Walls records silhouettes of dancing players in front of a wall. Three Drops allows players to interact with water in front of a wall at three different scales: a normal shower-like scale, a droplet level, and a molecule level. In their Fear game players can collaborate and simultaneously catch fruits with their shadow shapes, but they have to stand still when a jaguar is looking at them. Snibbe Interactive also created several other interactive installations, including Looking for Life, where evolutionary theory is depicted on an interactive wall, see Fig. 5. Players can influence lightning strikes and with them the cells that slowly evolve over time.Footnote 36

Parés and Parés created Lightpools [143]. Each of four players is given a lantern whose position is tracked, and each player gets a circle projected underneath the lantern. Virtual abstract objects fitting a specific lantern can be found; these can be fed to and grow with the projected circle, and subsequently move together with a player for some time, in order to be incorporated in a dance. Carreras and Parés also created Connexions, an interactive floor that responds to the positions and contours of 8–15 players [61]. The players have to stand on a variety of nodes spread over the floor, each representing a scientific concept. When the concepts surrounding one topic are stood on and players physically link by extending their arms, this topic is visualised on the floor; for example, extraterrestrial stone, atmosphere, and trajectory all belong to a meteorite object.

Palmer and Popat created Dancing in the Streets, an interactive projection in a public square [144]. It included flocking butterflies scared by quick movements and attracted by the players otherwise, ghostly feet following the users, and geometric shapes following and linking players in the space. Shadowing by Chomko and Rosier is also an (art) installation that is made part of a street or a square. It is an augmented projection of the silhouettes of earlier passersby.Footnote 37

MagicCarpet by Paradiso et al. is an example of an installation without visual feedback; instead it maps user input, using MIDI, into a playful interactive musical environment [136].

An example of an interactive floor environment close to the fixed playground equipment is Hanging off a Bar, 'in which players hang off an exercise bar over a virtual river for as long as possible' [145, p1]. Underneath the player is a pressure sensitive mat with a river projected on it. Occasionally a safe zone in the form of a projected raft gives the player the opportunity to temporarily rest their hands, arms, and legs.

During the last decade many commercial implementations of camera-projection systems have been introduced, see Fig. 6. For instance, Lumo Play and MotionMagix provide commercial software solutions, each with over 100 different games that can be bought.Footnote 38 Many of these systems and games do not make use of tracking of players (using both the location and identity); instead, in such games it simply suffices to detect movement at locations, for example, to scare fish or disperse a pile of virtual leaves. If such a system also tracks people (position + id), it allows for even more kinds of interactions. For instance, Moreno et al. created the Interactive Tag Playground, an interactive floor projection for research purposes [139]. In the tag game, each player has one circle following them, indicating their role, and children tag each other by letting their circles collide, see Fig. 6.
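To illustrate what tracking with identity adds, the following minimal Python sketch (hypothetical, not the actual Interactive Tag Playground code) keeps one circle per tracked player and passes the tagger role when the tagger's circle overlaps another player's circle.

```python
import math
from dataclasses import dataclass


@dataclass
class Player:
    pid: str             # identity from the tracker
    x: float             # position in metres, as reported by the tracker
    y: float
    is_tagger: bool = False
    radius: float = 0.4  # radius of the projected circle; value chosen for illustration


def update_tag(players):
    """Pass the tagger role when the tagger's circle overlaps another player's circle."""
    tagger = next(p for p in players if p.is_tagger)
    for other in players:
        if other is tagger:
            continue
        distance = math.hypot(tagger.x - other.x, tagger.y - other.y)
        if distance < tagger.radius + other.radius:  # circles collide: tag!
            tagger.is_tagger, other.is_tagger = False, True
            return other.pid
    return None


# positions would normally come from the camera tracker every frame
players = [Player("p1", 0.0, 0.0, is_tagger=True), Player("p2", 0.5, 0.3), Player("p3", 3.0, 2.0)]
print("new tagger:", update_tag(players))  # p2's circle overlaps the tagger's circle
```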

Fig. 6 On the left, children are playing in the Interactive Tag Playground created by van Delden et al.; image used with permission of the authors [146]. On the right, two children are playing a soccer game on a commercial Lumo Play installation; image provided by Lumo Play, used with permission

3.3 Geo-location devices

Geo-location devices make use of GPS (sometimes Wifi-, Bluetooth-, or RFID-enabled locating) to respond to being located somewhere. The games played with them, geo-location games, clearly provide a form of interactive play. However, we will not focus on them, as they differ from most of the previous systems: they trigger movement over larger distances, are (ideally/theoretically) not confined to a certain space, do not need co-located social interactions, and (mostly) do not need to be played by people at the same time. Therefore, the following set of systems can be seen as less complete than the previous types of systems. We provide a description of several types of systems that we have encountered in this domain, mainly using the 'early' and/or famous examples.

The recent hype around Pokémon Go and its success clearly shows that these games have a large attraction value. One reason for this rise, besides targeting a nostalgic fantasy world [75], is probably the now easily available location-specific infrastructure [85].Footnote 39 The games have great attraction value and are successful in getting children to move. However, only the future can show whether such games are actually suitable (for young children). The issue of safety, especially, could become a concern if the games persuade children to go to unsafe zones.

Vogiazou et al. created CitiTag, a game where a PDA device is used to play a location-based version of the traditional game of tag [147]. Björk et al. created Pirates, a mobile game themed around a pirate world, that uses proximity sensors to link visiting physical locations to sailing to and visiting virtual islands [148]. Flintham et al. created Uncle Roy All Around You, a mix between a geo-location game and theatre, revolving around the concept of trust [149]. Some players have to find 'Uncle Roy' by walking around the streets of London with handheld computers. Benford et al. also created 'Can You See Me Now', a tag-like game where performers/actors walk around a city with a PDA in order to chase after online (navigating) players [150]. Furthermore, Benford et al. created Savannah, an educational game for six children at a time about 'the ecology of the African savannah' [151]. Rogers et al. created Ambient Wood, a digital augmentation of a woodland, aimed as a learning experience for children carrying out a scientific inquiry [152]. Van Leeuwen et al. created Beagle, an app consisting of a 'radar' with which hospitalised children search for Bluetooth tokens (Beagles) distributed throughout a hospital [153]. Piekarski and Thomas created ARQuake, one of the first examples of an augmented reality game in an outdoor setting [154]. They built on the Quake game, in which players have to shoot monsters and can collect objects. Cheok et al. created Human Pacman, which uses a similar setup with improved hardware, including a see-through HMD augmenting the physical world with computer graphics [155]. They also added physical interaction with Bluetooth-enabled objects, and even sensing touch of an object or player. A similar game, PacManhattan, was created by NYU students but was less technologically enhanced: players had no HMD and had to update their own whereabouts at each street corner.Footnote 40

3.4 Addressed modalities

So far, this section presented an overview according to the physical characteristics. We now also provide an overview of these systems (in cited papers) looking at which modalities were used for feedback, and in what combinations. This overview also includes a limited number of available commercial systems that were used in the research activities such as the Wii, Kinect, and Donkey Konga. We omit duplicates of systems. That is, often identical systems are used for different types of studies and in the table we include only the first paper encountered with that version of the system (n=18).

Furthermore, review-style papers that do not need to clearly describe feedback modalities (e.g. [71]), often reporting on more than 6 devices, were also omitted from the overview (n ≈ 13, e.g. [3]). Many papers cited in this manuscript are used for underlying theory and do not include a clear description of systems (n ≈ 62Footnote 41). Five papers with additional systems related to new forms of CAPs (not included in Sect. 3) were explicitly added during the review process to exemplify modalities [63, 64], future directions [74], and to address theory [156, 157]. The resulting overview is presented in Table 1, where the numeral indicators help to recognise which combinations were addressed.

Table 1 The summed occurrences of modality combinations, based on 158 included systems

For Table 1 we have used a pragmatic subdivision of modalities into 15 categories suitable for our purposes. We used the use cases as reported in the referenced papers and as interpreted by the first author of this survey. For the visual sense we made a subdivision into displays (including LED displays), projections, LEDs, (spot)lights, and movement of objects. The auditory sense we divided into the categories music, voice, and sounds. We noticed that it was hard to recognise whether only sounds were made or actual music was created. We omit further subdivision into localised sounds with a (virtual) point of origin (e.g. [99]), directed sounds (audible in a small area), or types of sounds.

With regard to the haptic modality (or the somatosensory modality, a term with similar meaning), based on personal communication with two colleagues working on haptics, we included four subdimensions: tactile, kinesthetic/proprioceptive, thermoception, and nociception. Another subset triggering pruritoceptors, related to 'itches', might be considered an additional subset of nociception [158]. Although this suffices for the perspective of what is targeted, there can be some debate about whether all of these fit the haptic container.

The olfactory sense, the gustatory sense, and the sense of balance (equilibrioception), as well as the subdimensions of thermoception and nociception, were included during categorisation but later removed from the table. The few systems targeting these are reported on an individual basis in the text.

Possibly some systems include modalities that were not reported clearly. Furthermore, besides these enhanced, dynamic forms of feedback, all systems also have more static and inherent visual, tactile, auditory, olfactory, and taste characteristics. For example, a responsive moving styrofoam ball (visual-movement) that makes certain sounds based on user input [18] also has a certain taste, a certain colour, and might have a certain smell. Elements like these are not responsive to input, are (probably) not targeted specifically by designers, and are therefore not considered targeted modalities.

3.4.1 Observations on addressed output modalities

Table 1 is based on 158 systems, and it is noticeable that many systems provide feedback over several modalities (n = 106).

Many papers are unclear about what their system actually does with respect to the type of feedback provided: for example, is it a projection screen or a bright TV screen, what type of audio is played, and what physical shapes and materials are included?

Several iterations of systems are reported on, where modalities were added for such an iteration (e.g. sounds in the FeetUp system [22] but not yet in [97]; similarly in [21, 58, 59, 104] and [25, 26];Footnote 42 also, haptics as a simulated reaction force appeared in the Airkanoid system [82] but not yet in [83]). In one case this was the other way around: sound was first removed [29, 34, 102] and then added again in a modular fashion [103].

The interactive-toy kind of devices often focus on providing feedback with integrated LEDs and piezo speakers, providing sounds and, in some cases, short pieces of music and voice commands. This fits well with the idea of Head Up Games [85], where there is a focus on providing ‘Imagination’ over ‘Visualization of virtual worlds’.

Furthermore, it is interesting to note that the more uncommon modalities are, not surprisingly, not often targeted. The haptic thermoception category is only indirectly influenced by adjustable blowers in the Entertaining Archery Experience [37]. A feeling of tickling, fitting the haptic nociception (pain) category, is reported to be stimulated by visual input in [59]. No active influence on the olfactory category is mentioned in the papers used for this survey; it is mentioned at least once as a passive element taken into account [112]. The gustatory sense is only influenced in one system [64], with the use of cross-modal effects of visualisations. Two systems actively influence the equilibrioception (balance) category: the first does not explicitly state this aim but uses a treadmill that changes incline [49], and a swing is used in [112]. The latter already brings forward one of the examples going beyond screens and sounds that includes most modalities in different ways: the gas mask controlled swing by Marshall et al. [112]. In their project they include spotlights, a display, a moving swing (equilibrioception, kinesthetic/proprioceptive), voice communication, and, although passively, they explicitly consider modalities such as the heat, the tangible feeling, and the rubbery smell of the mask. Only Van Boerdonk et al. and Larsen explicitly targeted a non-visual experience based on the context, target group, and interaction [17, 134]; in three other papers there is also no explicit visual feedback [41, 95, 136].

3.4.2 Input modalities

Based on the table with examples of human sensory modalities for multimodal interfaces (e.g. visual: face location, gaze, facial expression, etc.) provided by Turk [159], and the examples of multimodal types of input by Oviatt [160] (i.e. speech, pen, touch, manual gestures, gaze, and head and body movements), we recognise that the large majority of CAPs used location and gesture/body movement, and pressure/touch. In other words, they rely on movement, impact, and pressing as user abilities. There are a few exceptions that did use microphones as part of a communication device [74, 99, 110, 150], in order to detect a scream [67, 132], or to recognise chewing [63]. Other examples of input beyond touch and body movement encountered in CAPs are the use of heart rate [90] or the flow of breathing [112].

In Sect. 5.3.4 we will explain that the implemented output modalities can have an effect on what kind of play the systems trigger [1], and in Sect. 6.3 we point to opportunities for research addressing more modalities, also for the input.

4 Evaluation techniques and methods

We have seen that there are many different systems. There are also many different ways to evaluate these systems. Evaluating interactive play systems that are controlled by moving the body is often not a straightforward task [132]. It regularly involves evaluation of interactive games/systems with children, which is a topic for a text book [161], a thesis [162], a paper [163], or at least an influential column in a journal on its own [164]. Furthermore, (open-ended) play interactions do not focus on efficient interactions [62], and instead focus on (user) experience. Several more traditional HCI evaluation approaches with certain questionnaires and measures will therefore not be applicable. This section includes a description of several methods and techniques, where applicable first referring to (a more extensive) research without CAPs before mentioning the application within a CAPs context. Many of these methods can be considered as ‘the basics’ that many readers already know. However, we think the descriptions showing how and where they are applied are suitable as an introduction (e.g. for starting students), as it helps to provide an overview of how this field often works. It also provides an overview of several relevant measures for evaluating CAPs. Therefore, we specifically see an added value of this section in showing the application and implementation of these evaluation methods and techniques as used in CAPs.

The experiment design is also a very important part of the evaluation. Depending on the context and extent of a learning effect, in some cases turning to a within-subject design in combination with a Latin square (controlled order) could help to appropriately deal with person-to-person differences [19, 39, 162]. However, a thorough description is outside the scope of this survey and we refer the reader to [165] for an old but comprehensive overview of (quasi-)experimental designs for educational purposes and the accompanying shortcomings and benefits regarding internal and external validity. Below we will mention the evaluation techniques and methods we have encountered that have successfully been used for the interactive play context once a proper experiment design is chosen.
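To make the counterbalancing idea concrete, the sketch below generates a balanced Latin square for ordering conditions in a within-subject design; it is a minimal illustration (the function name and the example conditions are our own), not an implementation taken from the cited studies.

```python
# Minimal sketch of counterbalancing condition order with a balanced Latin square.
# Illustrative only; names and conditions are hypothetical.

def balanced_latin_square(conditions):
    """Return one condition order per participant (fully balanced for an even
    number of conditions; for an odd number, reversed rows are usually added)."""
    n = len(conditions)
    rows = []
    for participant in range(n):
        order, j, k = [], 0, 0
        for i in range(n):
            if i % 2 == 0:
                idx = (participant + j) % n
                j += 1
            else:
                k += 1
                idx = (participant + n - k) % n
            order.append(conditions[idx])
        rows.append(order)
    return rows

# Each successive participant receives the next row, so every condition appears
# equally often in every position and directly before/after every other condition.
for row in balanced_latin_square(["A", "B", "C", "D"]):
    print(row)
```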

4.1 Discussions and notation of utterances

A first technique for evaluation is simply listening to what people have to say during and after their play activity. It can be an important source for information during evaluations. Various techniques have been developed to stimulate people to verbalize what they experience(d). Often quotes are used as examples to describe how people experienced a method [91] or design [98].

4.1.1 Thinking aloud

Thinking-aloud protocols are often used in evaluations with adults to gain more insight into what the user is thinking. They have been applied in evaluation research with children as well, although they might be unsuitable for analysing actions [78]. There is often a difference between the original strict guidelines/literature and practice, where in practice researchers do not keep to constant prompting every 15–60 s or use different prompts than a neutral simple prompt (‘Mm hm?’) [166]. When dealing with children such changes might become a deliberate choice in (future) techniques, as it can become distracting and forcing if one needs to keep on prompting non-talkative children [78].

4.1.2 Picture cards

Barendregt showed that for children combining thinking aloud with Problem Identification Picture Cards (PIPC) that depict frequently occurring problems can be a suitable aid to remind children what is of interest to the researcher [162].Footnote 43 The cards were beneficial for the number of problems indicated and were preferred by children as well. Other cards with pictures can also be used to structure an interview with children and help to keep children focused during a (semi-) structured interview [91].

4.1.3 (Semi-)structured interviews

While a structured interview always follows the same questions in the same order, a semi-structured interview leaves room to jump to a related question based on a response, whether this question was already planned for later or not. Depending on the target group and context, the duration of an interview is often kept short, especially when an informal interview is done at the point that players are about to leave after playing for some time [98], while after more extensive, planned tests it can take up to several hours [50]. Group discussions/interviews can also be done with multiple (child) players after a session [87, 90].

Similar to remarks made during the tests, quotes of people can be a convincing way to show how something was perceived by the players, for example, regarding the use and experience of the Breathless entertainment system—the gas mask swing—‘P2: it’s a bit you feel like oh no I don’t want to go now... but by the end you changed your mind’ [112, p132]. It is good practice to record and subsequently transcribe interviews when doing a thorough analysis, although at times it can suffice to only take notes during the interview in order to save time or to adapt to a certain context.

Emergent coding One advantage of transcribing interviews is that it makes it easier to quickly scan through the material and also helps when looking for recurring elements/themes. This is a first step in grounded theory, where researchers analyse their data, look for recurring elements and for when and how these elements/concepts do (or do not) differ, and from there slowly build towards new theories. Such a theory is ‘descriptive rather than predictive’ [167, p642]. Such a method was used, for instance, in analyses of jogging-over-a-distance with regard to social experience, to describe themes that could help to build guidelines [50]. A similar coding process was used to describe dominant themes in an interactive sport skill training with the Bouncer system [39]. This categorising process of analysing behaviour can also be done based on (video) observations (perhaps focusing more on interaction types) or on answers to open-ended questions in questionnaires.

4.2 Questionnaires

Most questionnaires make use of Likert scales, consisting of statements that the participant agrees or disagrees with (often on a scale of 4, 5, or 7 points). Several statements belong to one construct, and multiple constructs can be used to investigate a certain topic of interest, for example, the perceived presence of other players. Instead of Likert scales a type of semantic differential scale can also be used, where opposite verbal anchors are at the ends [168]. The questions (or agreements with statements) measuring one construct should be answered with approximately the same scores by one person, showing that indeed one construct is measured. This internal consistency can be expressed with Cronbach’s alpha. The most well-known example of such a validated questionnaire is probably the big five inventory regarding personality traits [169]. Such personality traits can influence results, and the inventory is often used to explore interesting links between personality and the experience or use of a system.
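To illustrate this internal-consistency check, the sketch below computes Cronbach’s alpha for a small, made-up set of Likert responses to the items of one construct; the data are purely illustrative.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a participants x items matrix of one construct."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items in the construct
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Five participants answering three statements of one construct on a 5-point scale.
responses = [[4, 5, 4],
             [2, 2, 3],
             [5, 4, 5],
             [3, 3, 3],
             [4, 4, 5]]
print(round(cronbach_alpha(responses), 2))
```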

Another predictor can be the tendency to get immersed in an interaction, and it could be helpful to apply a version—revised by Berthouze et al.—of the Immersive Tendency Questionnaire ((G)ITQ) (based on Witmer and Singer’s work [170]) before the interaction starts [171].

4.2.1 Game experience questionnaires (GEQ)

There are various validated questionnaires on topics regarding the perceived experience with the system that are often researched in the context of interactive play. These tend to have a sound theoretical grounding. There are questionnaires such as the Game Experience Questionnaire (GEQ) by IJsselsteijn et al. [172] that have been meticulously developed and can be applied easily, although they are awaiting official validation [37, 172–175].Footnote 44

Poels et al. also created an adapted version of the GEQ for children, which has been applied in several studies in adapted form (reduced/extended) [100, 179–181], but it seems not to have been validated yet, which limits the extent to which it is used in (analyses of) the results [180].Footnote 45

Questionnaires are also applied to look separately, and in more detail, into dimensions that are also part of the GEQ, such as (social) presence of other players [53, 62, 141], for example, with the Networked Minds Measure [183] or more regarding closeness [134, 141], for example, with the Inclusion of Other in the Self scale (IOS) [184], aspects of Flow [62], and (sensory and imaginative) immersion [185], for example, with the Immersion Questionnaire by Jennett et al. [167].

4.2.2 Fun toolkit

Read and MacFarlane describe the use of their Fun Toolkit and other survey methods with regard to evaluations with children [163]. They explain the use and disadvantages of several tools. It includes the use of a ‘Visual Analogue Scale (VAS), a pictorial representation that children use to identify their feelings or opinions’ [186, p83]. The Smileyometer can be applied [84] with such pictures, creating ‘a discrete Likert type scale’, which was intended to be used before and after the experienceFootnote 46 [186]. The Fun Sorter technique can be applied [90] to let children rank icons representing the items of interest on one or more constructs [186]. The Again-Again table can be applied [181] for one or more activities, giving a reasonable measure of fun; in this table children answer whether they want to do the activity again, choosing between yes/maybe/no [186]. The latter might also be adapted and applied to indicate what version they want to play [87].

4.2.3 Other questionnaires

Other applied questionnaires looked into aspects of intrinsic motivation [62, 134], for example, the Intrinsic Motivation Inventory (IMI) related to Deci and Ryan’s Self-Determination Theory [187],Footnote 47 into perceived exertion [185], for example, Borg’s Rating of Perceived Exertion (RPE) [189, 190], or into hedonic/pragmatic qualities [37], for example, using the aspects identified by Hassenzahl et al. [168]. The Self-Assessment Manikin (SAM) can be applied [134, 185] to measure human affective responses using a non-verbal pictorial assessment technique regarding pleasure, arousal, and dominanceFootnote 48 [191].

4.2.4 Open and less structured questions

Questionnaires can also leave room for less structured open comments, which—similar to noting down the utterances—can be a convincing way to represent the players’ experience. Open questions, sometimes combined with one or more scores (e.g. related to the ‘enjoyment’ of the experience), can also function as input for, or rationale for, a next iteration of a prototype or product [99, 192]. Even letting children draw might be an informal, fun way to engage children in discussions [78]. Subsequently coding the answers to open questions into categories often provides an insightful way to present the results [119].

4.3 Observations/video analysis

Video analysis is a method/tool often used in evaluation with children [1]. Druin et al. do mention that (in the old days) recording children was sub-optimal, as (sound) quality was mediocre and recording could also influence the behaviour, as children tended to ‘perform’ in front of the camera [193].Footnote 49 Other studies did not observe such a change in behaviour and did successfully use video analysis, perhaps due to incorporating different strategies in the evaluation protocol, or perhaps due to habituation to the recording technology [132, 152]. In some cases first-person-view/head-mounted cameras are applied [90]; in others multiple cameras are needed to cover the span of the playing field [87], in order to follow several children at the same time [151], or to look from different angles [39, 84]. Making a trade-off between in-depth analysis and efficiency, to shorten the time-consuming process of video analysis one could also make a pre-selection and/or shorten/edit the video clips to be analysed [98].

4.3.1 Peer-tutoring

Höysniemi et al. proposed to use peer tutoring in the evaluation of interactive systems for children [132]. In their peer-tutoring method a child learns the game and then instructs another child later on. These interactions are recorded, and the analysis of the explanations can show what kind of problems occur, or what elements are unclear. They applied this method in their analysis of QuiQui’s Giant Bounce and subsequently changed some controls accordingly. These changes were in turn tested and showed that children needed (significantly) less time to perform the actions/controls. Verhaegh et al. used the method to decide between/evaluate two interaction styles regarding their Camelot game [91]. Avontuur et al. adapted such a method to a group-based interaction for the evaluation of their BuzzTag game, although this became quite chaotic [88].

4.3.2 Annotation

Annotation schemes can be used to structure video analysis. These schemes consist of several constructs or dimensions of interest to the researcher, for example, physical activity levels during play. Several raters rate the behaviour of a person according to what extent or what type of behaviour they see a person performing, for example, sedentary behaviour. Raters either code what kind of behaviour is seen during an interval, or what is seen at a specific moment in time (momentary time sampling). Both are done with a specific time interval in mind that fits the behaviour to be annotated. We refer the reader to [194] for a more detailed explanation, a discussion of confusion, the use of annotation schemes, and indicating inter-observer reliability in an HCI context. Some simpler forms can also be used where several observers rate behaviour over the entire sequence on a scale, for example, the amount of movement from 1–7 during a game of Guitar Hero [19].

Several observation schemes exist related to (interactive) play regarding social interaction and physical activity, such as the POS [195], the OPOS [196], MIPO focusing on social functioning [197], the Social Play Continuum,Footnote 50 and OSMOS focusing on motor skills [198]; often such schemes are adapted and then applied for interactive play evaluation purposes [15, 98, 100, 125, 199]. One frequently used measure for inter-observer reliability is Cohen’s kappa. Some aspects of play, such as engagement or social interaction, seem hard to quantify with video observations, especially for children [87, 91].
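As a minimal sketch, Cohen’s kappa corrects the observed agreement between two raters for the agreement expected by chance; the interval labels below are made up for illustration and do not come from any of the cited observation schemes.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters labelling the same intervals."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[label] / n) * (freq_b[label] / n)
                   for label in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# Two observers coding the same ten intervals as 'active' or 'sedentary'.
rater_1 = ["active", "active", "sedentary", "active", "sedentary",
           "active", "active", "sedentary", "active", "active"]
rater_2 = ["active", "sedentary", "sedentary", "active", "sedentary",
           "active", "active", "active", "active", "active"]
print(round(cohens_kappa(rater_1, rater_2), 2))  # 0.47: moderate agreement
```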

Real-time observation is sometimes performed as well but is often too hard to perform reliably for interactive play evaluations, especially if one wishes to follow all children individually.

4.3.3 Occurrence of behavioural cues

Video analyses or direct observations without pre-defined annotation schemes can also (but in a more qualitative manner) provide information on what kind of behaviour occurs. Such findings are often not seen as thorough proof but either show that the evaluation held to the theory or show fruitful directions for future research on such theories. For instance, in a study with the Wearable Sounds Kit (WSK), besides more thorough quantitative analyses with a pre-defined annotation scheme, Rosales et al. compared the type of movements, fantasies, and explorative efforts between boys and girls. Their observations indicated more gun-play from boys, and more play related to birds and bubbles and a longer exploration phase for girls [98]. Morrison et al. used the descriptions of their observations to show that different types of play (related to those they found in literature) occurred in their open-ended interactive art works [62]. Bekker and Sturm used video observations to count the number and (count and categorize) the type of games played with the ColorFlare [179]. Back et al. coded observations in a qualitative way, looking at play types and locations [111].

4.4 Automatic measurements

One of the measurements related to fun is the time participants spend on an activity of their own volition. Commercial platforms from both Kompan and Yalp include web interfaces that can be used to see how often their equipment is used and which games are played. It seems this would also allow for long-term testing in real-life settings. Various systems can also make use of system logs of interactions, speeding up the evaluation process, for example, time played [119, 145] or use [119, 153], the performed movements/actions [200], and positions of players [25, 26].

4.4.1 Activity

Another dimension that is interesting for evaluation is a measure of the level of activity during play. In some studies Heart Rate (HR) sensors have been used to this end [24, 68, 121].Footnote 51 HR provides an indication of physical effort. In order to estimate effort as a percentage, HR values do need to be related to age, a personal optimum, or a recorded maximum heart rate.
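A minimal sketch of such a normalisation is given below; it uses the common age-based estimate of maximum heart rate (220 minus age) and the heart-rate-reserve (Karvonen) formulation. The formula choice and the example values are illustrative assumptions, not taken from the cited studies.

```python
def percent_hr_reserve(hr, age, resting_hr):
    """Rough effort estimate as a percentage of heart-rate reserve."""
    hr_max = 220 - age                               # common age-based estimate
    return 100 * (hr - resting_hr) / (hr_max - resting_hr)

# A 10-year-old with a resting HR of 80 bpm measured at 160 bpm during play.
print(round(percent_hr_reserve(160, age=10, resting_hr=80), 1))  # ~61.5 %
```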

Another way to measure movement is to use Computer Vision. The amount of movement can be tracked based on recordings using simple methods such as Motion Energy Analysis which essentially subtracts subsequent video frames from each other and sums the pixels that have been ‘moved’ [201]. Such a method has been used to show the amount of movement of groups of players in an interactive playground [24]. Instead of using the information based on all players, computer vision also allows researchers to track the movement and position of each individual player which can be used for more detailed evaluation purposes as well [25].
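The sketch below illustrates the basic idea of such frame differencing with OpenCV: subtract consecutive grey-scale frames and count the pixels whose change exceeds a threshold. The threshold value and the absence of smoothing or region-of-interest handling are simplifications for illustration.

```python
import cv2

def motion_energy(video_path, threshold=25):
    """Per frame pair: number of pixels that 'moved' (rough motion energy)."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return []
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    energy = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev)                 # difference with previous frame
        energy.append(int((diff > threshold).sum()))   # count changed pixels
        prev = gray
    cap.release()
    return energy  # one value per frame pair: higher means more movement in view
```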

A motion capture suit can also be used to automatically track the activity, including the type of movement, of the players, for instance allowing analysis of personality and the type of movements players make during whole-body gaming with interactive play systems [185] or between different conditions [19]. Some systems make use of multiple cameras, computer vision software, and infrared reflectors,Footnote 52 others use inertial gyroscopic technology.Footnote 53

Handheld devices could use their GPS data [50] or accelerometers to indicate the amount of movement, which is an aspect of activity, or use the GPS data and logs to analyse technology performance and players’ actions [150].

4.4.2 Physiological measures of affect

Besides activity, arousal can also be an interesting feature to measure. Galvanic Skin Response (GSR) measures conductivity related to sweat ‘production’ and is used as a means to measure arousal [202, 203] (sometimes even used as an in-game element (e.g. to indicate bluffing) instead of as an evaluation tool [81]). Mandryk and Inkpen used GSR measurements to evaluate game play of a traditionally controlled video game: NHL 2003 [203]. They also combined this with electrocardiography (ECG, which measures heart-rate-related parameters), respiratory measures (increased respiration also indicates heightened arousal), and electromyography (EMG, to measure muscle activity, which applied to the face can be related to positive/negative emotions or to tension, frustration, or concentration levels). Although these biometrics seem to be highly objective, in our experience interpretation is not always as straightforward or objective as it seems. Properly recording skin conductance can also be an issue in combination with energetic movement [204]. Yannakakis et al. also used such biosignals (skin conductance, blood volume pulse, and heart rate) in their interactive playware research (evaluations) to link them to entertainment preferences, which after machine learning could be estimated/modelled correctly in about 80% of cases [204].
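As a rough sketch of how such biosignal summaries could be linked to reported preferences, the example below trains a generic classifier on made-up per-session features; the features, data, and model are illustrative assumptions and do not reproduce the modelling used in the cited work.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-session summary features: mean/std of skin conductance and heart rate.
X = np.array([[2.1, 0.3, 110, 12],
              [1.4, 0.1,  95,  6],
              [2.6, 0.5, 125, 15],
              [1.2, 0.2,  90,  5],
              [2.3, 0.4, 118, 10],
              [1.5, 0.2,  98,  7]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = session reported as more entertaining

model = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(model, X, y, cv=3).mean())  # rough estimate of predictive power
```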

5 Type of research contributions

The argumentation that motivates the interest in research on interactive play, the end-user perspective (the higher-end goals described as argumentation), is different from what is targeted as a research outcome of a(n) (individual) study or paper, the research perspective. In this section we explain some of the contributions as we have seen them in research papers regarding CAPs. These contributions answer questions such as: What does this study show us? What can others learn from our research efforts? How can others apply the gathered insights?

In the broader perspective of HCI we see a spectrum of research on interaction that can focus on creating theory and/or informing design practice, as shown by Dalsgaard and Dindler [156], (building on) Höök and Löwgren [157], and (building on) Stolterman and Wiberg [205]. Both ends of such a spectrum still primarily target what we see as a research perspective: what researchers and HCI-practitioners can learn from (doing) the research.

Stolterman and Wiberg (besides this spectrum in HCI between guideline approaches and ‘imported’ grand theories) also point out a difference between concept-driven and situation-driven research [205]. Situation-driven research has ‘as a primary goal to create a (concept) design that would support the use situation’ [205, p100], whereas concept-driven research targets the creation of ‘concept designs and has as the primary goal supporting theoretical development’. This is related to whether the end-user perspective or the research perspective is the starting point, whether the intention is to create a desired situation or to improve theoretical concepts. We did not look specifically at such a distinction in research for CAPs; instead we mention three categories of research contributions that we currently recognise.

First, we address research contributions in the type of knowledge of what Höök and Löwgren describe as ‘intermediate level knowledge’, between instances and more elaborated theory [157]. Mentioned examples of this type of knowledge in the broader HCI research are methods and tools, guidelines, patterns, heuristics, criticism, experiential qualities, and annotated portfolios, and Höök and Löwgren add strong concepts to this. Although other examples of such intermediate knowledge might be present in the selected papers, we address guidelines, frameworks, and a bit on the design process. These three, in combination with the mentioned evaluation, show the necessary breadth in being closer to application or to theory, and to process or product.

Second, we mention examples of research with evaluations investigating a system’s fit for its (designated) purpose within a certain context, inherently related to a situation-driven approach.

Third, we show examples of investigating certain design elements. This kind of research answered questions like ‘If a system contains element X does this help to satisfy goal Y?’. This is often related to the generation or verification of guidelines, another way to find potential pre-patterns (cf. [206]), or a small part of showing fit for its designated purpose.

Note that this omits critical design, where the design research focuses, among other things, on changing perspectives [207]. Although this is likely to also have added value for CAPs, especially for those with a strong artistic statement or background, based on our selected papers we did not include it. More importantly, this section does not contain the notion of design concepts as research output: ‘manifestations of a more general theoretical notion in a more concrete design, focusing on overall organizing principles of the design as a whole and generally aimed at portraying future designs’ [157, p23:5]. In a similar fashion Dalsgaard and Dindler, as part of bridging concepts, mention that design exemplars can ‘embody the properties of the concept, reflecting the span from theory and practice’ [156, p1636]. Related to a concept-driven approach, concepts can function ‘as a design composition, bringing together technological advancements with functionality and focusing strongly on use, while inspiring theoretical and conceptual development in the field’ [205, p106]. The research-through-design model in HCI of Zimmerman et al. ‘emphasizes the production of artifacts as vehicles for embodying what “ought to be” and that influence both the research and practice communities [..] artifacts provide the catalyst and subject matter for discourse in the community’ [206, p498]. These formulations together show that designed artefacts in HCI in general can pave the way for new directions of interactions and even research discourse. We use this to emphasize the importance of concepts in the form of interactive artefacts as part of research output for CAPs, and we have already mentioned and shown a broad range of interactive play systems that might function as inspiration.

5.1 Structuring the design process by sharing challenges and experience

To aid in the design of interactive play systems many researchers share their insights in the form of guidelines, frameworks, taxonomies, or lenses.

5.1.1 Guidelines and lenses for the design process

Several guidelines and methods for designing and evaluating embodied interactive play systems have been introduced; we refer the reader to [208–210] for a few of the most comprehensive sets of guidelines related to exertion games. Furthermore, instructions, guidelines, and lenses provided for game design are certainly worthwhile considering during the design process of CAPs. Although they have not been based explicitly on interactive play, we refer the reader to [75, 211–213].Footnote 54

Soute and Markopoulos propose to merge the aspects of traditional outdoor play with computer games, and for their HUGs mention that technology should be simple, easy to bring along, trigger imagination (instead of unambiguous visualization), and trigger social interactions [85]. Bekker and Sturm examined how successful non-interactive play objects can be translated into open-ended play objects [179]. Building on this, Tetteroo et al. proposed a method to design interactive playgrounds in a systematic manner based on dimensions seen in traditional playground games [15]. Konkel et al. built on the games memory, tag, and hide-and-seek for their Tagaboo system [95]. Similarly, Moreno et al. as well as Rosales explained how they designed their interactive systems based on observations of traditional play sessions and games [94, 139]. We have seen that commercial systems also build on the power of ‘traditional’ games such as memory, tag, and freeze dance (stopping when the music ends), making music, or playing sports such as soccer. However, Soute and Markopoulos also remark that it seems important to realize early on in the design process what the benefits might be of technologically enhancing traditional play (e.g. random allocation of teams, hidden actions, balancing, etc.) [86].

De Valk et al. proposed a model to design for open-ended play [214]. Tiemstra et al. also provided a set of guidelines regarding the design of open-ended play systems based on their experience and observations of interaction with the SmartGoals [84]. Bekker et al. also included this open-endedness to use as one of four different lenses for the design of interactive play systems: (1) open-ended play, (2) forms of play, (3) stages (phases) of play, and (4) playful experiences [215].

Wyeth et al. created guidelines and urged developers and researchers to address fulfilling psychological needs with the design of whole body interaction for people with intellectual disabilities [216].

Furthermore, many papers only mention a few lines about the rationale behind certain design choices which could be seen as guidelines as well. For instance, using a fairly abstract shape to prevent a focus on the aesthetics (and prevent games depending on it), and instead letting the children focus on types of feedback but still making the interaction possibilities clear [180]. Another example is the rationale of Ishii et al. behind creating a variety of modes for PingPongPlus, that were chosen to span two identified dimensions: competition-collaboration and augmentation-transformation [122].

5.1.2 Frameworks

Frameworks can be developed in order to understand new research directions and to map out the opportunities and issues. These are the result of thorough analyses of a certain topic with design cases and are often related to or based on psychological models and other existing theories. In contrast to guidelines they do not describe straightforward rules on how something should or should not be designed; instead they provide perspectives: a focus on what could or should be investigated or designed, how elements relate to each other, or in what way a system can be described. Mueller et al. described their sports framework as a design vocabulary, a tool for discussions and for setting goals and aspirations, and ‘as a way to think and talk about it’ [217]. Examples of (preliminary/simple) frameworks are the Tangible Interaction Framework by Wyeth et al., which relates the design of playful tangibles (e.g. Wii) to engagement, specifying a dimension of representation and control [218], a framework for the evaluation of persuasion in games [219], or a framework for developing playful persuasion systems linking four levels of a design: transformation (the intention, to let a player jump), experience (triggering a need, for self-expression), interaction (a jump triggers sounds), and system (a musical staircase) [220]. The ‘sensitising terms’ of Morrison et al. can also be seen as a framework for open-ended interactive art installations that require whole-body interactions [221]. Carreras and Parés created a framework for a similar topic, designing full-body interactive experiences [61].

5.1.3 Design process

Another type of contribution is to add to the design process, mentioning certain techniques applied in a certain step of the design process and their applicability, sometimes for that particular project. For instance, the authors of the Entertaining Archery Experience, before identifying guidelines and hints for best-practice, mention how they applied known methods [37], similarly Brederode et al. describe the applied design process for their pOwerball [222].

An important aspect in the design process of interactive systems (for children) is participatory design, in which the end-users are part of development throughout the entire process, benefit directly, and get a fair say in the design directions. Many research contributions regarding interactive play and development (and evaluation) of interactive systems for children concern guidelines and techniques for such an approach. Several guidelines and techniques for doing participatory design have been proposed and adapted to use with children, such as contextual inquiry with additional note-takers, self-reflection discussions on behaviour and preliminary research findings, technology immersion, and guidelines regarding group composition/age [193].

Related to extensive participatory design practices is the somewhat limited but still iterative involvement of children, for example see [222]. One could also start with analyses of children’s play behaviour [25], which can be followed by an iterative process of testing (low-fi) prototypes to make decisions regarding the design [125].

5.2 Fit for purpose in a certain context

Besides showing how to design through examples of design cases, it is often good practice to show whether the design suits the context of use. Senda mentioned a broad distinction of four physical categories of contexts where children could play: (1) streets, (2) parks, (3) schools and education facilities including museums and libraries, and (4) public spaces [27]. Many of the papers and systems mentioned do indeed target one of these settings. The suitability in this physical context for the intended target group forms an important factor in showing that a system is fit for its (designated) purpose. Some systems aim at an older target group in the context of art galleries [221], exhibitions, and trade fairs [37]. Games such as airhockey over a distance are envisioned to be more appropriate for (employee) gathering areas (canteens, reception areas), arcades, airports, youth clubs, and children’s hospitals [128].

These examples show that a wide variety of contexts can be targeted and some authors show tests regarding the applicability in their envisioned environment, or they do their actual tests in the appropriate context to make their results applicable to such a realistic context. For instance, this was exemplified with the high throughput of people interacting with a water fountain, tested at the ‘Universal Forum of Cultures’ [108]. Morrison et al. did most of their investigations during art exhibitions [62]. Kajastila et al. placed their Augmented Climbing Wall in a commercial climbing centre [119]. Mast et al. [124] and van Boerdonk et al. [134] performed their user studies with, respectively, cooperative Tetris and TouchMeDare during a large music festival.Footnote 55 Lund et al. showed qualitative and preliminary results with a pilot for home rehabilitation with their interactive Playware tiles [34]. Van Delden et al. placed their playground in an art gallery for several months, where children enjoyed playing in it and came back to play with it again [146]. Hof et al. did their testing in an after-school care centre to deal with the influence of the environment and to provide known physical objects stimulating creativity [181]. To address the fairly specific context of disabled people, Larsen recorded numerous interaction sessions of this target group and their caregivers with his interactive play systems in the real-life setting of a care centre [17]. Van Delden et al. tested their personalised interactive gait rehabilitation games with therapists and rehabilitants during actual sessions at a rehabilitation centre [42].

5.3 Showing effect of design(-elements)

Besides the focus on structuring the design process and investigating the fit for purpose in context, we have also seen a large amount of research into interactive play that focused on investigating certain design elements. We cannot report all of these influences, but the following examples give a good impression of this type of contribution; to this end we highlight some that had an impact on our own work.

5.3.1 Embodiment versus traditional controller

Requiring the involvement of body movement can have a significant effect on players’ experience [209]. For instance, Berthouze et al. showed significantly higher engagement when comparing Guitar Hero played with a guitar to playing with a DualShock controller [19]. A significant positive effect on engagement and on movement was also found when players played with the guitar including the performance-like ‘star power’ movement (heavily tilting it) compared to playing without it. Furthermore, they indicate that such a fantasy-rich game element involves/triggers a different type of engagement than the hard fun/desire to win. Their results also indicate a significant effect on affect, where playing with a guitar seemed to result in more high-valence/high-arousal states. Similarly, playing two-player Donkey Konga with a bongo controller, compared to playing with a GameCube controller, resulted in higher engagement and more social interaction: more utterances, more instrumental gestures, and more emphatic gestures.

Exertion games compared to non-embodied interaction styles can also change competition [209] and can have a positive effect on connectedness/bonding and perceived video-conferencing quality [47].

Beelen et al. showed in a tug of war game that adding haptic feedback of the other player instead of a constant force added to the social presence of the other players [53].

5.3.2 Open-ended versus predefined games

Bekker et al. compared open-ended play to a pre-defined game with their LEDtubes but they found no significant effect on perceived social interactions (talking and collaboration), although the children did appreciate the open-ended version more [1]. Furthermore, provided with an open-ended play system, children will turn to their creativity and create various games, and once the device had added functionality (creating the ColorFlare) it appeared that more diverse games were created. A blend between the two can also be created where players are able to change the rules of the game in the system itself during play [88, 101, 223].

5.3.3 Shared object or a personal object in order to encourage social interaction

Based on their analyses of traditional playground games Tetteroo et al. state that shared/individual items can lead to in-game ‘status’ and as such could stimulate social interactions [15]. Rosales et al. debate whether this is true as they were more successful with an individual object, and also managed to trigger social interactions [22]. With their Swinxbee games Jansen and Bekker more convincingly showed by comparison that in their case shared objects in a collaborative setting did indeed stimulate forms of social interaction [100]. Nonetheless, one game with such an object that also triggered intense physical activity actually had a diminishing effect on social activity. They concluded that stimulating creativity and mimicking could also have a positive effect on the amount of social interaction without the introduction of shared objects. Following their conclusions, it seems that interactive play could either focus on (1) a fast paced competitive game/stimulating physical activity, or (2) stimulating creativity and social interactions [100]. Such decisions and goals also influence other choices, such as the role assigned to an artificial referee [223].

5.3.4 Multimodal output

Bekker et al. investigated their Multimodal Mixer to see what the impact was of adding modalities with sound and vibration feedback, compared to only providing light [180]. The number of games that were played in a session did not seem to differ much. The reported experience (enjoyment, fantasy, game creation abilities) did not seem to change either. The type of interactions with the device did change, where the richer feedback led to implementing more of the functionality into the games and triggered a wider range of input, although only the output modality changed [1]. Furthermore, it seems that the available modalities can also influence the type of games played; for example, vibration allows for secretive games, whereas visual cues trigger games like tag in which the devices also have to be looked at [1]. In a similar fashion, Jensen et al. noticed with their Football lab play-space for soccer training that using sound cues can make players focus too much on the ball, whereas colour-coded LEDs in different places could trigger the quick scanning behaviour that is important for football [38]. Both examples show that the choice of which modalities to target should fit the kind of game made for the system, and that the underlying goal of the installation should be considered as well.

Another obvious but important factor when considering multimodal output is the link to the target group and accessibility standards. Altamimi and Skinner cite three studies in which exergames are created for people with visual impairments, where interaction relies on tactile and auditory sensation instead of relying heavily on vision [70]. Even colour blindness can be an important factor when indicating a certain role with a certain colour [224]. Addressing people with hearing impairments also requires different versions of existing feedback, as was the case for colour blindness, such as integration with a cochlear implant that produces a sense of sound instead of simply playing sounds out loud [138].

Regarding multimodal output, it is also good to consider so-called cross-modal integration of perception. The best-known examples concern food intake, where, for instance, (dynamic) augmented visual cues [225] or an additional smell can influence the flavour experience [65, 225]. These types of cross-modal integration also go beyond food intake [159, 225]. Together, such examples form a good reminder that, when feedback is not uni-modal but provided in a multimodal fashion congruently and simultaneously, the modalities can have a synergistic effect strengthening hedonic experiences.

6 Towards intervention based play research

In this survey we have seen two perspectives. There are many novel and exciting systems for interactive play. These often originate from a user perspective: we want to achieve something new and worthwhile for the user. These ‘argumentations for play’ are discussed in Sect. 2. Subsequently, studies are carried out with the resulting systems, from a research perspective, as discussed in Sect. 5.

When these two perspectives are well integrated, that is, when the user studies are well controlled and are meant to show effects that support, or contribute to, the aim and justification of the system from the user perspective, this yields what we call intervention based play research. In this section we will discuss some characteristics of examples of successful intervention based play research in our survey. The remainder of the section will then discuss some promising directions for this type of research.

6.1 Experimental research

Intervention based play research includes a focus on doing ‘experimental research’. There is a difference between (A) showing the possibility of a new technology, exploring the design space, or investigating specifics of an interaction, and (B) showing the effect of certain concepts, choices, or designs. While for the former it suffices to make one design, discuss some of its hypothetical possibilities while reporting successful user experiences but refraining from any conclusion on causality of design elements (e.g. [222]), for the latter more advanced experimental research should be done.

Claims regarding a certain guideline, fit for purpose, or design element are made more powerful when there is a comparative study between such a choice and an alternative. It is important to actually evaluate and compare multiple design options in order to show that a suggested design decision was indeed of influence.

We noticed that a comparative experimental design of evaluation is targeted by many in this field. To create interesting comparisons different versions of concepts can be developed to investigate their influence, for instance in a broader context of a research through design approach [179].Footnote 56 The comparative experimental design of evaluation is shown in use cases where physical play behaviour is deliberately changed with design elements [24, 26, 125], with HUGs regarding incorporating HR or not [90] and incorporating physical or virtual objects [87], and regarding the effect of embodied interaction on social presence, social interactions and bonding [19, 47, 124], on video quality [47], and on engagement [19] and excitementFootnote 57 [155].

In other words, in many lines of research related to CAPs, we see a focus on making a distinction between intentions and effect. This focus makes the difference between design options merely allowing certain behaviour to occur and proving that they actually encourage, promote, or elicit it. Luckily for us all, it is also easier to compare two relative experimental variables than it is to prove that a single one works [165]. Therefore, we emphasize the importance of using an appropriate experimental design in intervention based play research. By investigating interventions (e.g. design options, user characteristics, or a certain context) in a well-structured comparative experimental design with randomised control groups or ‘randomly’ assigned conditions, we try to exclude the possibility that the encouragement of the wanted behaviour is merely the effect of context or the very nature of the players during evaluation (testing, instrumentation, etc.). This provides us with comparative results. Unfortunately this also often requires a controlled study set-up, which results in a less ‘holistic picture of how children [or players in general] play’ [181].

We have also seen that many researchers show that their research fits their underlying argumentation for developing certain kinds of systems. For example, the effects that they measure in experiments relate to the impact that they set out to achieve. Their motivation and argumentation can focus on the research contributions, or more on end-user related goals. In order to work towards achieving the underlying argumentation, it seems good to also actively promote certain kinds of behaviour. We can, and probably should, investigate what elements of a design elicit such positive effects. We can make use of the possibilities introduced with the introduction of interactive technology during play. For instance, the introduction of a controller that requires embodied interaction can have positive effects on goals such as increasing physical activity and social interaction [19]. These kinds of interventions, when chosen well and evaluated appropriately, fit into intervention based play research: an interaction-design oriented approach that should add value from the end-user perspective as well as from the research perspective (the knowledge base), while using scientific comparative experiments for evaluations.

Obviously, one should take care not to overgeneralize and not to ‘read too much into the data’, especially when a single group of children is involved in combination with statistical methods [186]. Furthermore, there are important formative evaluations that often require more qualitative insights and might benefit from being investigated in a more efficient manner. These formative evaluations are necessary to get to a good design, and sharing such findings can also be informative for others. Therefore, our suggestion for following an intervention based research approach focuses more on ‘end’ evaluations, when the interaction design needs to be less explorative.

We will now mention opportunities of interactive play that fit the intervention based play research approach, and could help to bring the researcher perspective and the end-user perspective closer together.

6.2 Adaptive, balancing, and steering interactions

Poppe et al. mentioned that stimulating behaviour change during play and incorporating adaptive systems can be fruitful directions for interactive play research [2]. Smart solutions that balance based on the players’ effort seem to be promising for allowing people with different physical skills to play together [121, 226].

Play can also be actively steered to temporarily increase or decrease activity [24, 68]. Steering refers to reaching goals by the deliberate introduction of interactions that change in-game physical play behaviour in desired directions [146]. This steering is closely related to what Altimira et al. called inducing behaviour and might allow us to improve the experience during longer lasting sessions, balance a game, and cause people to move more or interact more [68, 123]. Furthermore, some players might not (be expected to) be as socially involved in the game as the others. This could be sensed or set by a facilitator, and subsequently the game could give this player another role and/or have them lured into the play by others [26, 44]. In a learning context for children with hearing impairments, where making mutually understandable vocalisations between children is one of the goals, a game can stimulate the use of hands to prevent sign language from being used [138]. In these types of contexts there is an opportunity to influence children’s play in desired directions while they are playing. This steering of interactive play behaviour during the game means triggering a change of behaviour in wanted directions with playful elements. It also seems to be slightly different from most ‘traditional’ persuasion and nudging activities. It does not primarily aim to change long-term lifestyle behaviours outside the game, such as smoking, (un)healthy diets, medication intake, or daily level of physical activity. It is different from constraining behaviour [67], or from manipulation and deception [227], even if the participant might not perceive this as such the first time: it does not deliberately hide options or enforce a way of interaction by making it the only means of input. Instead, and similar to using different ways to explain suggested use to people [134] or explicitly leaving it out for intended ambiguity [223], it tries to change the play interaction itself: influence the players’ activity, performance, or role, change the interactions between players, the locations players visit, or the type of interaction players perform [24, 26, 44]. Further investigation of such techniques might bring us closer to successfully addressing the goals we have mentioned as argumentation.

6.3 Addressing more modalities more

From Table 1 it is clear that the sound and visual types of feedback are predominantly used. Especially uncommonly addressed are the, in general, more exotic modalities for HCI: olfactory (smell), gustatory (taste), haptic-thermoception (temperature), haptic-nociception (pain, as well as itching), and equilibrioception (balance). We have already pointed out recent endeavours that resulted in intriguing systems when several modalities are targeted more in depth, leading to thrilling experiences [112], cross-modal interactions [64, 225], or making interactive spaces accessible for people with certain disabilities [70, 138].Footnote 58

Depending on what one defines as multimodal, one might question whether some CAPs that seem to respond only to body movements should be considered to be multimodal. Following Oviatt’s description of ‘two or more combined user input modes’ [160, p1] would identify them as not being multimodal. Turk’s description would allow for either interpretation: ‘a channel describes an interaction technique that utilizes a particular combination of user ability and device capability (such as the keyboard for inputting text.... Multimodal interaction, then, may refer to systems that use either multiple modalities or multiple channels’ [159, p191]. The description of Bekker et al. in [180, p329]: ‘we use the word modality to indicate a form of sensory output of the play object’, would identify most of the CAPs in this paper as multimodal, see Table 1.

Research by Oviatt [160] in a more traditional HCI setting, on how multimodal interaction (voice and pen input) changed communication, how providing the possibility for multimodal input did not mean users would interact using that input, and how additional modalities could be complementary but also asynchronous, triggers the question of how addressing more modalities would change interaction in CAPs. Besides changing the interaction possibilities from a human perspective, multimodal cues might also be used differently and for other types of sensing: for instance, currently we are looking into whether sound levels combined with the movements of players can help to make a system adaptive to ‘engagement’ in play. Furthermore, from the examples of sensory modalities (related to user abilities) indicated by Turk, we see that many are hardly used, such as facial expression, lip movements (which might have changed the interaction for children with a Cochlear Implant [138]), speech input (for simple instructions), other types of non-speech audio, and face-based identity (perhaps suitable to create personalised, adaptive, and adaptable CAPs).

At this point it is good to emphasize that the input modality of a system is not related one-to-one to the accompanying sensory modality of the human: to use an example from CAPs, microphones might be related to a human making sounds but can also be used primarily to measure movement or touch in a malleable interface [16]. We also noticed this difference between, on the one hand, modalities for system input and output and, on the other hand, human action and perception, when we created the overview of ways of augmented feedback presented in Table 1. Early on in the process we made the decision to indicate the modalities that were augmented by the system, using categories based on triggering the player’s senses (where a single LED might be perceived differently from a screen or a projection), and the table is discussed as such. However, it is also interesting to see how an interaction can trigger stimulation of another human sense, for example triggering a feeling of vertigo with a non-augmented swing. We have seen both an example of such a swing being moved by the system [112] and of it being activated by the user [113]. This raises the question, which we ourselves have not yet looked into, whether humans would perceive the augmented feedback differently in a play-space; if this is not the case, this also limits the accuracy or usefulness of Table 1.

Another way in which addressing modalities is important is in the way they are reported. During the creation of the overview presented in Table 1 we noticed that descriptions of visuals are in general more detailed than those of the other modalities. Sounds in particular were often described in very minimalistic ways. This might be inherently related to the traditionally paper-based medium of writing scientific papers. We suggest that for audio this could partially be resolved with a movement towards including and referring to videos and giving access to audio files, as is possible for this current journal for this reason, and as is promoted for conferences such as CHI and SIGGRAPH. In turn this can also make audio a more prominent and recognised part of scientific dissemination. Other modalities will have to make do with descriptive analogies and metaphors, at least for the time being.

6.4 Beyond first time use

Many studies on pervasive play-spaces focus on first time use [1]. Due to the novelty of the interactions, the results of such studies are often heavily influenced by this first time use; in the longer run behaviour might change. This could also mean that the effect of design elements can change in the longer term, and children might become less inclined to play again after several sessions [181]. For instance, sounds might be of added value in the beginning but could annoy people (especially adults, neighbours, or bystanders) if they are monotonic and uniform and are played over and over again with limited variation [113]. Bekker and Sturm already suggested in 2009 that showing the true promise of interactive play(-objects) also requires longitudinal studies [179]. At the same time, Hof et al. noticed that performing a user test several times (three times, once each week) with observations and questionnaires with the same groups of children is already very difficult to arrange [181].

The use of automatic measurements might aid such play analyses over the longer term. The commercial systems of Kompan and Yalp provide logs of how long each game is played, and they can also be updated remotely. With the increasing number of playgrounds sold around the world (over a hundred for some systems), it could become interesting to start scientific research with these systems: long-term tests using A/B testing, in which certain game elements and design patterns are varied and the effect on game play is evaluated in order to inform future design.
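
As an illustration of how such remotely collected logs could be analysed, the sketch below (in Python) compares logged play-session durations between two hypothetical game variants with a Mann–Whitney U test. The log format, variant labels, and durations are invented for illustration; commercial systems such as those of Kompan and Yalp may log and expose their data differently.

# Hypothetical sketch: comparing play-session durations logged by two variants
# (A/B) of an interactive playground game. Log format and values are invented.
from statistics import median
from scipy.stats import mannwhitneyu

# Each entry: (variant, session duration in seconds), as a remote log might provide.
logs = [
    ("A", 310), ("A", 250), ("A", 420), ("A", 180), ("A", 295),
    ("B", 400), ("B", 360), ("B", 510), ("B", 330), ("B", 450),
]

durations = {"A": [d for v, d in logs if v == "A"],
             "B": [d for v, d in logs if v == "B"]}

stat, p = mannwhitneyu(durations["A"], durations["B"], alternative="two-sided")
print(f"median A: {median(durations['A'])} s, median B: {median(durations['B'])} s")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")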

The ability to update the systems remotely also allows the content to be changed in order to keep it up to date with contemporary trends and, for some aspects (e.g. a quiz), to keep it unpredictable. Both features, the automatic logging of game play and the structural variation of game elements similar to the ones mentioned earlier, allow for longer-term studies and thereby promise a way to bring the research perspective closer to the end user’s perspective.

6.5 Another context: escape-the-room games

One currently popular context for play-spaces is escape-the-room games. Additional technology in these rooms allows for new dedicated devices and tools for thrilling puzzle activities; for instance, using video conferencing systems in two similar but slightly different rooms allows the play to be distributed in interesting ways [73, 74]. We also see ample opportunities for the integration of other existing technology, including Virtual Reality and Augmented Reality systems. We use the work of Pan et al. and Shakeri et al. to speculate that such technologies (with their inherently altered, private, and public information) could create augmented real-life escape rooms that trigger types of playful embodied interaction that stimulate verbal communication, sharing of information, and solving augmented puzzles [73, 74]. We also envision that players can be assigned specialist roles in the escape rooms, giving them unique abilities and thereby forcing certain social roles (cf. [44]). Furthermore, such a context also allows for investigating most of the promising directions we have discussed. We envision experimental research with similar groups visiting similar but slightly different rooms, where aspects of interest can be altered. We see opportunities to steer behaviour to let players investigate social roles, or to balance between player skills by providing hints in a more adaptive manner. The systems can include more exotic modalities, where players might be required to eat, use cooled or heated devices, or incorporate released smells in order to solve puzzles, or to encounter pleasantly painful or equilibrioception-rich (targeting the sense of balance) experiences.

7 Conclusion

Our survey shows that research into Co-located Augmented Play-spaces (CAPs), also called interactive play systems or interactive playgrounds, covers a variety of research topics, directions, outcomes, and approaches. With this manuscript we have summarised several aspects that can be of interest to researchers in this field, perhaps inspiring new combinations of work.

We have also reiterated a possible way to further research in the field, a way used by many scientific researchers: intervention-based play research, an interaction-design-oriented approach that should add value both from the end user’s perspective and from the research perspective (the knowledge base), while using scientific comparative experiments for evaluations. We mention four directions that seem to fit well with this approach: (1) turning to longer-term use, probably making use of automatic measurements and existing commercially available playground installations; (2) making use of the mediating powers of interactive play in order to adapt or steer play to better satisfy the envisioned end goals we try to fulfil for the users; (3) addressing all modalities in reporting, design, and evaluation; and (4) suggesting escape-the-room games as a suitable context where some of these elements can come together.

The overview might function as a guide for a new generation of Ph.D. students and researchers, as it brings together various core works and researchers of this field. We invite others to broaden the research perspective and expand the playing field.