1 Introduction

Currently, many autonomous mobile robots still operate in environments without people. As robot tasks extend into human environments, robots will increasingly share the space in which they work with humans. Current examples include mobile robots with a dedicated task that concerns humans, such as museum tour guides [17] or robots that care for the elderly [5]. These kinds of robots obviously encounter humans when performing their tasks. It is also important to take robots in a more industrial context into account [25]. For example, autonomous guided vehicles (AGVs) performing pick-and-place tasks in a warehousing environment are also likely to encounter humans.

When mobile robots operate in the same environment as humans, they first and foremost need to be safe [40]. This is usually achieved by limiting the speed and optimizing the sensing capabilities of the mobile robots to avoid collisions. Apart from physical safety, people also need to feel safe and comfortable in the proximity of the robot. For example, the robot might be moving safely, but people can still feel discomfort or lack trust in the behaviour of the robot [21]. To prevent this, it is important that robots take the comfort of humans into account when planning their paths and navigating through an environment. If people feel comfortable around robots, robots are more likely to be accepted in human environments, which in turn increases their added value. For this to happen, robots should take social rules into account and respect socially acceptable distances [27]. In this paper, we aim to experimentally investigate these distances. A seemingly straightforward approach is to implement how humans distance themselves from each other, which is studied in the field of proxemics.

Fig. 1
figure 1

Different representations of personal space. a Circular shape [11, 14], b More space at frontal zone [3, 13], c More space at rear zone [4, 28], d Elliptical shape [15], e Asymmetrical shape [10]. Picture based on [32]

1.1 Human Proxemics

Proxemics is the field of research in psychology that studies how people utilise the physical space around them and position themselves with respect to each other [11]. An important concept in proxemics is personal space. Humans subconsciously take the personal space of other humans into account when navigating an environment and having social interactions [20]. It causes discomfort if the distance between two people does not match the kind of interaction that they have [12].

Hall [11] described personal space as a set of concentric circles. Based on observations and interviews, he recognized four different circular zones, each suitable for different kinds of interactions. The closest zone is the intimate zone (\(\le \) 45 cm), which is meant for intimate interactions, such as embracing, wrestling, or whispering. The personal zone (45–120 cm) is for interactions at arm’s length: two people can still touch fingers if they both extend their arms. Most interactions take place in the social zone (120–350 cm), which is used for having a conversation. The furthest zone is called the public zone (> 350 cm) and is used for public speaking. Hall acknowledged that the distances of these zones are highly dependent on cultures and individuals.

More recent work agrees with Hall’s notion that personal space is circular [14]. This was tested by approaching participants both in a real and in a virtual environment. However, there have been other findings on the shape of personal space. Others have found a non-circular shape, claiming that people need more frontal space [3, 13]. This was tested both by approaching participants in a real-world environment [13] and by letting participants approach virtual humans (embodied virtual agents) [3]. A possible explanation for these findings is that the actions we can perform (e.g., walking, grabbing, and pointing) usually extend towards the front. The explanation can also be found in social moderators, such as eye contact. Bailenson and colleagues [3] found that when the gaze of a virtual human tracks the approaching participant, the participant keeps a larger interpersonal distance to the front of the virtual human. These results are in line with the Equilibrium Theory [2], which posits that factors such as interpersonal distance and eye contact exist in an equilibrium of intimacy. If people make eye contact, the interpersonal distance should be larger to prevent the interaction from becoming too intimate. When there is no eye contact, for example when approaching someone from the back, interpersonal distances can be smaller [1].

In other studies, researchers found the opposite result, i.e., people needing more space behind them [4, 28]. In both studies, participants were approached by the experimenter. The need for more space at the back is explained in these studies by the fewer sensory inputs behind a person. In front, people can use vision to orient themselves; behind them they cannot, which gives them less information on the location and activity of another person. Less information means more uncertainty about that person’s location and activity, which can make people feel less safe or uncomfortable.

All of the models presented above consider people who remain in their original position. If people are walking around in the environment, it makes sense to also take their direction of motion into account. In several potential field models, usually based on mathematical assumptions, the representation of a person depends on their motion direction. In the Social Force Model [15], for example, the path of people is modeled using forces: attracting forces from their target position and repulsive forces from other people in the environment. The repulsive forces are oriented with respect to the direction of motion, and the accompanying function has the shape of an ellipse. This creates a fourth possible shape of personal space, next to circular shapes and shapes with more space at the frontal or rear zone (see Fig. 1).
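For illustration, a common formulation of the repulsive potential between pedestrians \(\alpha \) and \(\beta \) in the Social Force Model is (a sketch of the standard formulation; the notation may differ slightly from [15]):

$$\begin{aligned} V_{\alpha \beta }(b) = V^{0}_{\alpha \beta }\, e^{-b/\sigma }, \quad 2b = \sqrt{\left( \Vert \mathbf {r}_{\alpha \beta }\Vert + \Vert \mathbf {r}_{\alpha \beta } - v_{\beta }\, \Delta t\, \mathbf {e}_{\beta }\Vert \right) ^{2} - \left( v_{\beta }\, \Delta t\right) ^{2}}, \end{aligned}$$

where \(\mathbf {r}_{\alpha \beta }\) is the vector from \(\beta \) to \(\alpha \), \(v_{\beta }\) and \(\mathbf {e}_{\beta }\) are the speed and motion direction of \(\beta \), and \(\Delta t\) is a small time horizon. The equipotential lines of b are ellipses elongated along \(\beta \)’s direction of motion, which corresponds to the elliptical shape in Fig. 1d.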

So far, all the possible shapes of personal space that we discussed were left–right symmetrical, but some studies suggest that personal space may be asymmetrical [6, 10]. Gerin-Lajoie et al. [10] asked participants to bypass an obstacle (both in a real and in a virtual environment) and observed that people usually kept a smaller distance to the obstacle at their dominant-hand side. A related pattern can be observed in traffic. In [6] it was found that in places where traffic drives on the right side of the road, pedestrians also usually pass other people on the right side of the corridor or sidewalk. This could indicate that people require more personal space at their right. However, this contradicts the findings of [10], assuming that for most people the right-hand side is their dominant side. This could be explained by the fact that passing on the right-hand side is the usual social convention, but if there is no clear rule or task and people are allowed to choose freely, they are comfortable with less space at their dominant side.

The differences in the shapes of personal space between the studies described above may be explained by differences in research methods. In testing the shape of personal space, the participant can be either the approacher or the one who is approached, the person can be static or dynamic, and the environment can be virtual or real. However, the method does not always predict the shape that will be found, because similar methods, such as approaching a person in a real-world environment, can give different results [13, 14, 28].

In Fig. 1, five different shapes of personal space are visualized. It appears to be largely unclear which shape is the best representation of comfort zones of humans. The shape of personal space seems to depend on a variety of contextual and interactional variables, including the task content, the speed and direction of movement of the interactants, and the presence or absence of strong social cues, such as mutual gaze. However, more studies will be required to disentangle these factors, and to assess their individual and combined contributions to the shape and size of personal space over time and in different contexts.

1.2 Human–Robot Proxemics

It is not immediately obvious that human proxemics also applies to robots, especially when they are not humanoid. This means that we have to determine how comfortable people are with different distances between themselves and robots. Most research in this area has been done for robots with a dedicated social task, such as care robots. Several studies have investigated the relationship between personal space and perceived comfort, with differing results [32].

Walters et al. [41, 42] performed several studies where a robot approached a human, or vice versa. They found that about 40 percent of their participants were comfortable with an approach distance in the intimate zone as defined by Hall (< 45 cm). The authors explained this finding by hypothesizing that these participants did not regard the robot as a social entity. As a result, they would not mind a close distance. Other participants in their sample preferred regular approach distances comparable to interactions with humans, in the personal or social zone of Hall (around 120 cm). This means that the approach distance seems to depend on personal preferences. A later study demonstrated an opposite effect. Torta et al. [39] had a small humanoid robot approach senior participants and asked them to stop the robot at a comfortable distance for interaction. They found much larger distances than usual for human interactions, ranging from 160 to 180 cm.

The studies described above differed in several aspects: the type of robot, the user group, and the behaviour of the robot. Proxemic preferences concerning a robot approaching a person seem to depend on a variety of factors, so it is important to take the precise context into account [19]. For example, the behaviour and perceived skills of the robot determine how close the robot is allowed to come [26]. When it appears as if the robot cannot understand a person unless it is close enough, people will take this into account when determining an appropriate distance. The activity and personality of the user also influence proxemic preferences [33, 39]. Among other things, this shows that the precise shape and size of the personal space of humans with respect to robots depends on the particular type of interaction.

Most of the studies described above focus on approaching a person. However, in many practical applications, robots do not have to interact with people directly, for example when they are cleaning the floor [9]. They merely need to avoid a collision with a person, but while doing so, they also need to take human comfort into account. There are some experimental studies on comfortable passing distances [27, 29], whose main results are that larger distances relate to higher comfort levels and that passing side (left compared with right) has no effect on comfort. Regarding passing at the back or front of a person, Lichtenthaeler et al. [22] investigated appropriate and comfortable crossing strategies. According to their results, stopping was the most natural strategy for crossing paths with a human. Lo et al. [23] compared crossing strategies and found that the strategy where the robot accelerated or deviated from its path was perceived as most comfortable. To the best of our knowledge, the difference in level of comfort between passing in front of or behind a human has never been studied.

1.3 Human-aware Navigation

If a robot merely needs to navigate around people, the proxemics rules for interaction may not hold [24]. Much work has been done on how to incorporate humans in a navigational model. This specific field of study is known as human-aware navigation [21], and the focus is on path planning in an environment where humans are present. There are several possible approaches to a human-aware navigation algorithm, for example using deep learning to recognize human activities [38], (inverse) reinforcement learning of pedestrian behaviour [16, 31], or using a cost map to represent a person in a navigational framework.

An example of the latter option is the Human-Aware Motion Planner [36, 37]. With this motion planner, robots are expected to move safely, reliably, and in a socially acceptable manner around humans. People are represented by a cost function that prevents robots from coming too close to them. Furthermore, robots are expected to be as visible as possible to humans. This means that they will choose a path in front of the human over a path behind the human. A surprise effect is also avoided, by preventing the robot from appearing in the view of a human too suddenly when passing an obstacle. In this model, humans are considered to be non-moving.

In research by Satake et al. [34] the movement of people is taken into account by predicting a person’s walking behaviour. From this walking behaviour the robot assesses whether a person is willing to interact with the robot. After navigating to this person, the robot attracts the person’s attention and starts a conversation. Kirby and colleagues [18] also considered people to be moving entities and therefore they modeled people with a personal space that extends towards the front. In this framework, social conventions, such as keeping to the right of a hallway, are taken into account.

In other work [7], the perceived actions of humans are also taken into account. A person is represented by a Gaussian function in a cost map that depends on the perceived action: for example, a walking person needs more space in front. A similar approach is taken by Papadakis et al. [30], where humans can indicate a passing side to the robot using a gesture, which is detected by the robot and taken into account when planning the path. Furthermore, humans in conversation are recognized as such, which prevents the robot from interrupting the conversation; instead it plans a path around them. A minimal sketch of such a direction-dependent Gaussian cost is given below.
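The sketch below illustrates the general idea of representing a person by a Gaussian in a cost map whose spread depends on the person's state. It is not the exact cost function of [7] or [30]; all parameter names and values are illustrative assumptions.

```python
import numpy as np

def person_cost(x, y, px, py, heading, walking=False,
                sigma_base=0.5, front_scale=2.0, amplitude=1.0):
    """Cost contribution of a person at (px, py) to a grid cell (x, y).

    The person is modelled as a 2D Gaussian; if the person is walking, the
    spread along the heading direction is enlarged in front of the person,
    so more space is reserved there (illustrative, not the model of [7]).
    """
    dx, dy = x - px, y - py
    # Express the cell in the person's local frame.
    lon = np.cos(heading) * dx + np.sin(heading) * dy    # along heading
    lat = -np.sin(heading) * dx + np.cos(heading) * dy   # sideways
    # Larger spread in front of a walking person, base spread elsewhere.
    sigma_lon = sigma_base * (front_scale if (walking and lon > 0) else 1.0)
    sigma_lat = sigma_base
    return amplitude * np.exp(-0.5 * ((lon / sigma_lon) ** 2
                                      + (lat / sigma_lat) ** 2))

# Example: cost 1 m in front of a person walking along the +x axis.
print(person_cost(1.0, 0.0, 0.0, 0.0, heading=0.0, walking=True))
```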

Summing up, human-aware navigation methods always treat people as a distinct type of entity, different from other obstacles. They differ in whether people are assumed to be moving and in whether human behaviour is taken into account. The goal of human-aware navigation is to let the robot plan a suitable path around the person, and comparable methods are used to test the comfort and efficiency of human-aware navigation frameworks. In all frameworks described above, humans are represented in a specific way, taking their motion, visual field, or actions into account. However, there is little to no research investigating the appropriate shape and size of the personal space of a person that is passed by a robot. It is therefore important to validate or investigate the human representations used in navigation frameworks.

1.4 Research Aims

The aim of the current study is to experimentally determine the shape and size of personal space of a person when the robot is passing by. Since personal space is related to (dis)comfort [12], we focus on perceived comfort to indicate the shape and size of personal space. We expect perceived comfort to increase with distance between robot and human. Based on a similar study on robot proxemics [27] we also expect no difference between the left and right side of a person. Many sources on human proxemics assume that the shape of personal space is roughly circular [11, 14]. We explicitly test this by including passages in front and at the back of a person.

To test our expectations, we used a controlled environment in which a robot passes people on different sides (left, right, front, or back) and at different distances, to give an indication of the shape and size of personal space. After each pass, participants indicated their feelings of comfort. We conducted two experiments, one with a humanoid robot and one with a non-humanoid robot, to be able to compare the results in different contexts. Furthermore, using two different robots allows us to distinguish between robot-specific results and more general results.

2 Method

To test our hypotheses, we conducted two similar experiments. Study A involved a humanoid robot and study B a non-humanoid robot.

Fig. 2
figure 2

Experimental set-up for a study A and b study B. The codes around the figure represent the 24 different starting locations of the robot. The first letter represents the passing side (Left (L), Right (R), Back (B), Front (F)), the next set of letters the starting position (clockwise (CW) or counter-clockwise (CC)), and the number the distance in cm (50, 70 or 90 in study A and 60, 80 or 100 in study B). The figures are drawn to scale, as are the dimensions of the two robots that were used

Fig. 3
figure 3

Picture of experimental set-up for a study A and b study B, in which the robots used are shown. In study A the Pepper robot (Softbank Robotics) was used, in study B a custom-made AGV was used. Participants were standing on indicated footsteps

2.1 Participants and Design

For both studies, participants were sampled from the participant database of Eindhoven University of Technology. A large part of this database consists of students at the university, but some working or retired people from the Eindhoven region are also registered. All participants in the current study had little to no experience with robots, limited to vacuum cleaner robots at home and participation in other experiments involving robots.

Study A Twenty participants (8 males and 12 females, \(M_{age}\) = 28.1, \(SD_{age}\) = 16.9, range: 17–82) participated in study A with a 4 (passing side: front, back, left or right) x 3 (passing distance: 50, 70 or 90 cm) x 2 (starting position: clockwise or counter-clockwise) within-subjects design. All participants experienced all 24 trials, visualized in Fig. 2a. The robot started at one of these 24 locations and passed the person, who was standing in the middle of the lab. Passing distances were measured from the center of the human to the center of the robot. The starting position variable indicates from which starting point the robot passed by for any given side of passage. A clockwise starting position indicates that the robot moved in a clockwise fashion: for example, when passing in front, it started on the left (FCW), and when passing on the right, it started in front (RCW), and so on. Similarly, counter-clockwise indicates that the robot moved in the opposite direction. The order of the trials was semi-random: the first trial was chosen at random, and in subsequent trials the passing distance either decreased or increased at the same passing side, while the robot alternated between the clockwise and counter-clockwise starting position. After three trials in the same direction, a new direction was tested.

Study B Twenty-one participants (6 males and 15 females, \(M_{age}\) = 33.0, \(SD_{age}\) = 15.7, range: 20–66) participated in study B. None of them had participated in study A. The design was similar to that of study A. The passing distances were all 10 cm larger (60, 80 or 100 cm) because the robot used in study B is approximately 20 cm wider than the robot used in study A, as shown in Fig. 2b. Since passing distance represents the center-to-center distance, the clearance was comparable across both studies. The randomization of the order of the trials was identical to study A.

2.2 Materials and Measurements

In study A we used the Pepper robot (Softbank Robotics), which is shown in Fig. 3a. This robot has the following dimensions (L: 425 mm, W: 480 mm, H: 1210 mm). The speed of the robot during the experiment was set to 0.35 m/s, which is the default setting for the Pepper robot, and the robot exhibited no explicit social cues (e.g., eye contact) while passing. In study B we used a custom-made Autonomous Guided Vehicle (AGV). This robot has the following dimensions (L: 730 mm, W: 650 mm, H: 900 mm). The speed of this robot was also set to 0.35 m/s to be able to compare the robots, and again there were no explicit social cues during passing.

After each trial participants were asked to answer one question: “How comfortable were you with the passing of the robot?”. They answered this question on a 7-point scale, ranging from ‘not at all’ to ‘very much’.

Fig. 4
figure 4

Comfort ratings for each passing distance, passing side and starting position for study A (a) and study B (b). Error bars represent standard errors

2.3 Procedure

The procedure was the same in both studies. Participants entered the lab and signed an informed consent form. After this, they were instructed to stand in the middle of the room on the indicated footsteps (shown in Fig. 3). They were allowed to look around, as long as they kept their shoulders straight as much as possible (to prevent them from looking back). Next, the robot passed them. During each pass, participants stood still on the two oval footstep shapes that were taped to the floor. After each trial, participants walked to a computer in the lab to rate the perceived comfort of the previous pass. This was repeated 24 times. After the last trial, they answered questions on demographics, and their shoulder width and waist circumference were measured. At the end, they were thanked for their participation. The total experiment lasted 30 minutes, for which participants were paid €5, or €7 for people who needed to travel.

3 Results

Before testing our hypotheses, we checked whether any of our demographic variables (age, gender, nationality) had a significant effect on comfort levels. A Spearman correlation showed a small but significant positive correlation between age and comfort (r(984) = 0.13, \(p<\) .0001), indicating that older people were more likely to give higher perceived comfort ratings. We did not have enough spread in ages, nor did we have specific expectations about effects of age. Therefore, it is currently unclear how this effect originates. The other demographic variables showed no significant effect on comfort.

3.1 Comparing Comfort Levels

The results of study A and B are presented together to aid comparison. Participants’ average shoulder width was similar between study A (M = 41.6 cm, SE = 0.78 cm) and study B (M = 41.1 cm, SE = 0.76 cm). The average waist circumference was larger in study A (M = 82.7 cm, SE = 2.39 cm) than in study B (M = 78.3 cm, SE = 4.34 cm), but not significantly so (t(39) = 0.88, p = .39). Thus, we can compare the findings of both studies without adjusting for distance.

We conducted a repeated measures analysis of variance (rANOVA) to test whether comfort was significantly influenced by our three predictors: distance (50, 70 or 90 cm in study A; 60, 80 or 100 cm in study B), passing side (front, back, left or right) and starting position (clockwise or counter-clockwise). The average comfort values across all the different conditions of both studies are visualized in Fig. 4.
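As an illustration of this analysis, a minimal sketch in Python using statsmodels is given below; the data file and column names are hypothetical, and we do not claim this is the exact software pipeline used in the study.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Long-format data: one row per trial, with columns for the participant id,
# the three within-subject factors, and the comfort rating (1-7).
df = pd.read_csv("study_a_ratings.csv")  # hypothetical file name

aov = AnovaRM(
    data=df,
    depvar="comfort",
    subject="participant",
    within=["distance", "side", "start"],
).fit()
print(aov)  # F statistics and p values for main effects and interactions
```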

In study A we found a significant effect of distance on comfort (F(2,42) = 164.55, \(p<\) .0001, \(\eta ^2\) = 0.43). This means that if the distance between the human and robot is greater, perceived comfort is higher. In addition, there was a significant effect of passing side on comfort (F(3,42) = 27.26, \(p<\) .0001, \(\eta ^2\) = 0.16). Perceived comfort was highest in the front (M = 5.47, SD = 1.60), and lowest in the back (M = 4.19, SD = 1.47). Perceived comfort was similar between the left (M = 4.69, SD = 1.71) and the right (M = 4.94, SD = 1.70).

Fig. 5
figure 5

Perceived comfort as a function of the minimal passing distance for study A (a) and study B (b). The points (shape symbols) represent the average comfort level indicated per condition. The colored lines represent fits of the inverted Gaussian per passing side. Error bars represent standard errors

Between distance and passing side, we found a small but significant interaction effect (F(6,42) = 3.63, p = .002, \(\eta ^2\) = 0.05): for passing at the back, comfort increased less with distance than for the other passing sides. Another small but significant interaction effect was found between passing side and starting position (F(3,42) = 6.51, p = .0003, \(\eta ^2\) = 0.04). This finding is most clearly visible at a distance of 70 cm (see Fig. 4a): when comparing comfort between left and right, there is a clear difference between the clockwise and counter-clockwise starting positions, which correspond to the robot starting from the front or from the back of the person. In both cases, comfort is higher when the robot starts at the front. There were no other significant effects in study A.

In study B we also found a significant effect of distance on comfort (F(2,43) = 84.16, \(p<\) .0001, \(\eta ^2\) = 0.27). As in study A, perceived comfort increased with distance, but the effect appears to be smaller than in study A. We also found a small but significant effect of passing side on comfort (F(3,43) = 4.12, p = .007, \(\eta ^2\) = 0.03). Again, similar to study A, perceived comfort was highest when passing in the front (M = 5.36, SD = 1.70) and lowest in the back (M = 4.87, SD = 1.75). Perceived comfort was similar between left (M = 4.97, SD = 1.69) and right (M = 5.05, SD = 1.82). There were no other significant effects or interaction effects in study B. In Fig. 4, both studies show a similar trend at the middle passing distance (70 cm in study A, 80 cm in study B): a small preference for the robot starting at the front of the person. However, in study B this interaction effect is not significant (F(3,43) = 1.85, p = .19).

Table 1 Fitted parameter values of Equation 1, for all passing sides and both studies. \(\theta \) corresponds to the polar angle of the passing side. \(a_0\) and \(\sigma \) are parameters of the fitted inverted Gaussian (indicated in Equation 1) and also the standard errors (SE) of these fit parameters are given. All parameters are significant with p < .001

Comparing the results of study A and study B on comfort values, we see that the overall perceived comfort was slightly higher for the non-humanoid robot (M = 5.06, SD = 1.75) than for the humanoid robot (M = 4.82, SD = 1.68). Comparing the averages of perceived comfort for the different passing sides, the only significant difference between the two studies is found for passing at the back (t(244) = −3.25, p = .001, Cohen’s d = 0.042). Perceived comfort was higher for the non-humanoid robot (M = 4.87, SD = 1.75) than for the humanoid robot (M = 4.19, SD = 1.47). However, because the two studies involved different groups of participants, it is more reliable to compare the fits between the studies, which we do in the next subsection.

3.2 Fitting a Relationship Between Distance and Comfort

To quantify the differences in comfort, we fitted an inverted Gaussian that relates distance to comfort. This function was found to fit the data very well in our previous work [27]. It was chosen over a second-order polynomial because a polynomial would have a maximum, so extrapolating beyond the measured range would lead to implausible results. Additionally, an inverted Gaussian can be described with only two parameters. The inverted Gaussian is given by:

$$\begin{aligned} C = 1+a_{0} (1 -\exp (-\frac{d^2}{2 \sigma ^2} )), \end{aligned}$$
(1)

where C stands for perceived comfort, d represents passing distance, \(\sigma \) is the width, and \(a_{0}\) is the height of the inverted Gaussian. For the results of both studies, we fitted this inverted Gaussian for each passing side (left, right, front and back). Based on least-squares regression, significant fit values were found for all parameters. The resulting curves are shown in Fig. 5, and the parameters are listed in Table 1.
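A minimal sketch of such a least-squares fit of Equation 1 in Python is shown below; the distance and comfort arrays are placeholders rather than the measured data, and we do not claim this is the exact fitting procedure used.

```python
import numpy as np
from scipy.optimize import curve_fit

def inverted_gaussian(d, a0, sigma):
    """Equation 1: comfort as a function of passing distance d (in cm)."""
    return 1.0 + a0 * (1.0 - np.exp(-d**2 / (2.0 * sigma**2)))

# Mean comfort ratings per passing distance for one passing side
# (placeholder values for illustration only).
d = np.array([50.0, 70.0, 90.0])
c = np.array([4.2, 5.0, 5.5])

params, cov = curve_fit(inverted_gaussian, d, c, p0=(4.0, 50.0))
a0_fit, sigma_fit = params
se = np.sqrt(np.diag(cov))  # standard errors of the fitted parameters
print(a0_fit, sigma_fit, se)
```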

Comparing Fig. 5a and b, we notice that the resulting curves are very similar for both robots. Using t-tests, we checked whether any of the parameters in Table 1 differed significantly between study A and study B, which was not the case (all t’s < 1.33, all p’s > 0.18). The only noticeable difference between Fig. 5a and b is in the curve for passing at the back: for the humanoid robot, the difference between the back and the other three passing sides grows with distance, while this is not visible for the non-humanoid robot.

To investigate whether the function we use is a good representation of the relation between distance and comfort, we plotted the current data in the same figure as our previous data [27] and also included similar data from Pacchierotti et al. [29]. The result is shown in Fig. 6. In this figure, the distance shown is the lateral clearance: the distance between the edge of the person and the edge of the robot. Since these studies used three different robots with different dimensions (the PeopleBot in [29], the Pepper robot in [27] and in study A, and a custom-made AGV in study B), center-to-center distances correspond to different clearances. It is clear that all studies show the same relation, albeit that the perceived comfort levels of Pacchierotti et al. [29] are a little higher than those of Neggers et al. [27], and the results of studies A and B are a little lower. Otherwise, the pattern of the data is very similar, suggesting that the combined fit is an accurate description of the relation between distance and comfort.

Fig. 6
figure 6

Perceived comfort as a function of the passing distance (expressed as lateral clearance) for the data of Pacchierotti et al. [29], Neggers et al. [27], study A and study B. The points (shape symbols) represent the average comfort level indicated per condition per study. The black line represents a combined fit. Error bars represent standard errors

Fig. 7
figure 7

Schematic overview of distance d and angle \(\theta \). Distance d represents the center-to-center distance between the human and the robot at the closest point of passage, angle \(\theta \) represents the polar angle of this point. All possible angles are mentioned in Table 1

Fig. 8
figure 8

Interpolation of parameters \(\sigma \) and \(a_{0}\), using a discrete Fourier analysis to be able to calculate comfort values in between the four passing sides. Dots represent the fitted parameters, lines represent the result of the Fourier analysis. Error bars represent standard errors

3.3 Creating a Personal Space Model

We used the fitted inverted Gaussian as shown in Fig. 5 to create a personal space model for both robots used in study A and study B. For this, we created a contour plot that depicts the shape and size of personal space. To do so, the polar coordinates (r, \(\theta \)) of the closest point of passage are computed. The value of r represents the closest passing distance d (Eq. 1). The value of \(\theta \) represents the polar angle of the closest point of passing. The meaning of these values is shown in Fig. 7. This means that passing on the right corresponds to a \(\theta \) value of 0. Passing at the front, left and back correspond to \(\theta \) values of \(\pi / 2\), \(\pi \) and \(3\pi / 2\) respectively. The angles of the four directions that we tested are also indicated in Table 1.

The parameters of the inverted Gaussian differ for each value of \(\theta \) (side of passage). To interpolate between our four passing sides, we used a discrete Fourier analysis, which allows us to calculate the comfort value in between the four directions. The resulting periodic functions of the Fourier analysis are shown in Fig. 8. In this figure two cycles of the functions are shown; however, the functions wrap around after one cycle (e.g., \(\theta =2\pi \) represents the same location as \(\theta =0\)). For more information on the discrete Fourier analysis see [8].
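A minimal sketch of such a trigonometric (discrete Fourier) interpolation of the four fitted parameter values is given below; the implementation and the example values are our own illustrative assumptions, not the exact procedure of the study (see [8] for the general method).

```python
import numpy as np

def fourier_interp(theta, values):
    """Trigonometric interpolation of n values sampled at the equally
    spaced angles 2*pi*k/n (here n = 4: right, front, left, back)."""
    values = np.asarray(values, dtype=float)
    n = len(values)
    theta_k = 2.0 * np.pi * np.arange(n) / n
    a = lambda m: (2.0 / n) * np.sum(values * np.cos(m * theta_k))
    b = lambda m: (2.0 / n) * np.sum(values * np.sin(m * theta_k))
    result = a(0) / 2.0
    for m in range(1, n // 2):
        result = result + a(m) * np.cos(m * theta) + b(m) * np.sin(m * theta)
    # Nyquist term (half weight) closes the interpolation for even n.
    result = result + 0.5 * a(n // 2) * np.cos((n // 2) * theta)
    return result

# Example: interpolate fitted sigma values over all angles (placeholders, cm).
sigma_fits = np.array([40.0, 35.0, 42.0, 55.0])   # right, front, left, back
theta = np.linspace(0.0, 2.0 * np.pi, 361)
sigma_of_theta = fourier_interp(theta, sigma_fits)
```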

Using the parameters found with the Fourier analysis, we can compute a comfort value for each location in the surroundings of a person. The resulting contour plots are shown in Fig. 9. These contour plots depict both the shape and the size of personal space. The shape is clearly visible: in both contour plots, comfort is lower at the back of a person. The size of personal space depends on the task constraints. If, in a certain context, robot efficiency is more important than human comfort, a lower level of comfort may be considered acceptable, e.g., 3.80, as shown in Fig. 9. This allows the robot to pass close by a human and thereby increase its efficiency. Conversely, if human comfort is more important, a higher level can be chosen, e.g., 5.55, as shown in Fig. 9. In this case, the size of personal space is larger and the robot will keep more distance to a person. Furthermore, in the latter case the shape also differs: at a comfort level of 3.80 the shape is rather circular, but at a comfort level of 5.55 it is elongated at the back.
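For illustration, the sketch below evaluates Equation 1 on a grid around the person using angle-dependent parameters. It reuses the fourier_interp and sigma_fits definitions from the previous sketch, and the a0_fits values are placeholders rather than the fitted values in Table 1.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder fitted parameters per passing side: right, front, left, back.
a0_fits = np.array([4.3, 4.6, 4.4, 3.9])
# sigma_fits and fourier_interp are defined in the previous sketch.

x = np.linspace(-200.0, 200.0, 401)         # cm, person at the origin
y = np.linspace(-200.0, 200.0, 401)
X, Y = np.meshgrid(x, y)
R = np.hypot(X, Y)                          # closest passing distance d
Theta = np.arctan2(Y, X) % (2.0 * np.pi)    # 0 = right, pi/2 = front (top)

sigma_map = fourier_interp(Theta, sigma_fits)
a0_map = fourier_interp(Theta, a0_fits)
comfort = 1.0 + a0_map * (1.0 - np.exp(-R**2 / (2.0 * sigma_map**2)))  # Eq. 1

# Contours at the comfort levels discussed in the text.
cs = plt.contour(X, Y, comfort, levels=[3.80, 5.55])
plt.clabel(cs, inline=True)
plt.gca().set_aspect("equal")
plt.xlabel("x (cm)")
plt.ylabel("y (cm)")
plt.show()
```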

Fig. 9
figure 9

Shape of personal space for study A (a) and study B (b). The values indicate comfort level on the associated contour. The arrow represents a human with the front directed to the top of the figure. Axes represent the position in centimetres from the center of the human

4 Discussion

When robots navigate in human environments, they need to represent the personal space of humans correctly. In the current study, we experimentally investigated the shape and size of the personal space of an individual who is passed by a robot. To study this, both a humanoid and a non-humanoid robot passed participants in a lab at different distances and on different sides. After each pass, participants were asked to indicate their perceived comfort. Results indicated that comfort increased with distance, which is in line with our earlier work [27] and consistent with [29]. Furthermore, especially in the case of the humanoid robot, passing at the back of a person feels less comfortable than passing at the front, which makes the shape of personal space non-circular, contrary to what we expected. In addition, the humanoid robot coming from the back was perceived as less comfortable than coming from the front.

We did not find a difference between passing on the left and on the right of a person, which matches our previous findings [27]. Although several studies report a preferred side or an asymmetrical shape of personal space [6, 10, 39], we did not find one here. Assuming our participants were mostly right-handed and were used to keeping right in traffic, a preferred side would have been easy to understand. The apparent absence of a difference in comfort between the left and the right might be explained by the context of the current experiment: task constraints can supersede social conventions or preferences [18], and since the current task was very straightforward, there may simply have been no clear preference.

We fitted the results with an inverted Gaussian and showed that this function is a good fit for the relation between distance and comfort. The results of the different studies (the current study, [27] and [29]) can all be described by the same function. The only difference is that comfort seems to be a little higher overall in our earlier work [27] and in the work of Pacchierotti et al. [29] than in the current results. This may be due to differences in study design. In our earlier work [27] and the work of Pacchierotti et al. [29], participants were walking in a hallway while passing the robot, whereas in the current study participants were asked to stand still on the indicated footsteps. It could be that walking gave them a greater feeling of freedom to avoid the robot, and therefore they rated smaller distances as more comfortable. Additionally, the set-ups of our earlier study [27] and the study of Pacchierotti et al. [29] also differed: in [29] the robot actively moved further away from the person while passing, while in our earlier work [27] the robot drove along the same line. Looking at Fig. 6, the first option appears to be more comfortable. Overall, we argue that the inverted Gaussian captures the relation between comfort and distance rather well, but the quantitative comfort values seem to depend on context, e.g., whether people were moving and whether the robot actively diverged from its path.

We used this function to create a contour plot of the shape of personal space. Since passing at the back is less comfortable, the resulting shape is not circular. The shape of personal space that we found in the current study roughly matches the findings of previous research in human proxemics [4, 28], which indicated that people prefer more space at the back. Clearly, people cannot use vision at the back, which makes it harder to determine where the robot currently is and where it is going. We think that this can also explain the findings of the current study. Some participants indicated that they were uncomfortable because they heard the robot and even felt the floor vibrate as it passed behind them, but they could not see where it was or where it was going. Presumably, visual feedback and predictability are important for perceived comfort.

Another possible explanation of this finding is that people were focused on the robot because their task in the experiment was to rate their perceived comfort with the passing of the robot. If people would be distracted or mentally occupied with a different task, it might be that they would not even notice that the robot passed them at the back. Additionally, if this would be the case, passing at the front could even be more uncomfortable because of a surprise effect when they suddenly notice the robot in their visual field while being distracted [37].

Incorporating the current findings in the Human-Aware Motion Planner [37] is relatively straightforward. In this motion planner, a robot chooses a path in front of a human over a path behind a human to avoid an undesirable surprise effect. Its representation of human proxemics in a cost map can easily be related to the personal space model obtained in our current work. Our results can also assist other human-aware navigation methods in representing humans in a cost map when planning an efficient route from A to B. The contour plot of human comfort that we created (see Fig. 9) can be used to represent non-moving humans. Many path planners use cost maps to plan an acceptable and efficient path; such a cost map can be used in a trade-off between human comfort and robot efficiency, by using different sizes of personal space depending on context-related task constraints, as sketched below. For future research, it would be interesting to compare the cost map obtained in the current study with pedestrian data acquired in the real world [31], to see whether the findings hold for people navigating amongst each other.
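As a hypothetical illustration of such a trade-off (not a method described in this paper), the comfort surface from the earlier sketch could be converted into grid-cell costs, where the chosen comfort threshold controls how large the protected region around the person becomes:

```python
import numpy as np

def comfort_to_cost(comfort, comfort_threshold=5.55, max_cost=254):
    """Map predicted comfort to a cell cost on an arbitrary 0-254 scale.

    Cells whose predicted comfort falls below the threshold receive a cost
    that grows as comfort drops; cells above the threshold stay free. A
    higher threshold favours human comfort, a lower one robot efficiency.
    """
    deficit = np.clip(comfort_threshold - comfort, 0.0, None)
    deficit_max = comfort_threshold - 1.0   # comfort is at least 1 (Eq. 1)
    return np.round(max_cost * deficit / deficit_max).astype(np.uint8)

# 'comfort' is the grid computed in the contour-plot sketch above.
cost_map = comfort_to_cost(comfort, comfort_threshold=5.55)
```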

Comparing the findings for the humanoid robot with those for the non-humanoid robot, passing at the back was perceived as less comfortable for the humanoid robot than for the non-humanoid robot. A possible explanation for this finding is that the humanoid robot is regarded as more social than the non-humanoid robot: intrusion of another social actor into one's personal space causes discomfort, while standing close to a non-social obstacle is not an issue. Presumably, something similar applies when the robot is moving. The same explanation was given in other research, where a moving robot approached people and some participants let it come as close as possible [41]. However, this does not explain why the difference is mainly visible when the robot passes a person at the back, so further research is necessary to investigate this phenomenon.

Except for the difference in the back, the results of the humanoid and the non-humanoid robot are very comparable. This suggests that these results could also apply to other robots.

4.1 Limitations and Future Work

Our work is part of a larger research programme where we are systematically measuring and modelling human responses to dynamically navigating robots present in the environment around the human. As many variables are simultaneously at play, experimental approaches require drastic simplifications of corresponding real-life situations in order to ensure methodological rigor. The traditional trade-off between experimental control and ecological validity also holds for research in social robotics.

Additionally, some participants in our study had some experience with robots from other experiments, which may have influenced the results. However, it is likely that people will encounter robots more and more often in the future. Therefore, we have to take people with all kinds of experience levels into account, and in that respect we believe the current participant sample to be representative. In future studies, it would be good to take experience with robots and other individual differences explicitly into account.

In the current work, the participant stood on a fixed spot. In real life, the robot will likely mostly encounter humans in dynamic situations, e.g., when they need to cross paths or pass or overtake each other. In the current results, we see that people require more space at the back. However, it stands to reason that when a human is walking forwards, it is also important that they have enough space at the front. We are planning to validate the shape and size of personal space we found in the current research with moving humans in future research. An extra benefit of this would be that it enables us to include behavioural measures like the walking path of the humans, next to subjective measures like a questionnaire.

Another interesting aspect is to investigate movement speed of the robot. Currently both the humanoid and the non-humanoid robot had a relatively slow movement speed (0.35 m/s). In early research, it was found that people start to feel uncomfortable when a robot moves faster than the average speed for humans (1 m/s) [35]. Future research could investigate how the movement speed of the robot relates to the shape and size of personal space, especially when the robot has a higher speed than humans.

4.2 Conclusion

In the current study, the shape and size of the personal space of a person who is being passed by a robot were determined, both for a humanoid and a non-humanoid robot. Results show that perceived comfort increases with distance. There is no difference in comfort between the left and the right side of a person, but people are less comfortable with robots that pass them at the back. This results in a cost map that is non-circular and extends further at the back. This cost map hardly differs between the humanoid and the non-humanoid robot. We expect that the shape of our cost map for static people is similar for many robots; its size will probably depend on contextual factors. The insights gained in the current research can serve as input for path planning algorithms implementing human-aware navigation.