The slogan “robots will pervade our environment” has become a reality. Drones and ground robots are used for commercial purposes, while semi-autonomous driving systems are standard accessories in traditional cars. However, while our eyes have been riveted on the dangers and accidents arising from drones falling and autonomous cars crashing, much less attention has been paid to the dangers arising from the imminent arrival of robots that share the floor with pedestrians and mix with human crowds. These robots range from semi-autonomous to fully autonomous mobile platforms designed to provide various kinds of services, such as assistance, patrolling, tour guiding, delivery, and human transportation. We highlight and discuss potential sources of injury emerging from contacts between robots and pedestrians through a set of case studies. We look specifically at dangers deriving from robots moving in dense crowds. In such situations, contact will not only be unavoidable, but may be desirable to ensure that the robot moves with the flow. As an outlook toward the future, we also offer some thoughts on the psychological risks, beyond the physical hazards, arising from the robot’s appearance and behaviour. We also advocate for new policies to regulate mobile robot traffic and enforce proper end-user training.
Progress in the design of human-aware robots now makes it possible to deploy robots in human-inhabited environments. The last couple of years have witnessed steps in that direction with the introduction of autonomous cars, drones and ground robots for last-mile delivery services [2, 3], robots as assistants and tour guides for visitors [4, 5], and autonomous wheelchairs. The deployment of robots for public use has the advantage of enabling a larger population to benefit from advances in automation. However, it introduces new hazards that may endanger our lives on a daily basis.
For many years, robots were restricted to industrial settings. Safety was settled simply by stating that robots were not to interact with humans. It was then the robot operator’s responsibility to ensure his/her own well-being by closely following the safety guidelines. Accidents occurred mainly during line work, maintenance, and programming. With robots providing services in non-industrial settings, accidents may occur at any time during normal operation and may affect not just the robot’s operator, but also bystanders. The latter are people who simply happen to share the robot’s operating environment: they neither control the robot (and may have no knowledge of its functioning) nor use it, but may indirectly benefit from its services (e.g. a person walking in a train station while a cleaning robot is executing its task). The typology of robot operators has also changed, and it now includes lay users, namely people who have received limited or no training in operating the robot. Therefore, new sources of hazards have emerged for robots providing services outside factories, and new strategies for ensuring safety are needed.
To ensure that cohabiting with robots remains safe for humans, it is necessary: (a) to equip robots with sufficient sensing and control capabilities to be fully aware of their environment and to react adequately (we postulate that current robots deployed in public spaces do not satisfy this primary requirement); and (b) to educate lay users about the potential dangers that robots create, so as to avoid harmful situations. To our knowledge, no effort has been made in that direction to date.
Given the vast literature on the potential dangers created by the use of autonomous cars and drones, see e.g. [8, 9], this article focuses its analysis on the dangers generated by robots meant to share the floor with pedestrians. We refer to a variety of mobile platforms, autonomous or semi-autonomous, ranging from humanoid robots, such as Pepper, and self-driving vehicles, such as Starship personal delivery robots, to semi-autonomous devices such as Segways, also known as personal mobility devices or “rideables” (which include e.g. power scooters, hoverboards, unicycles, etc.) (Fig. 1). These robots are designed to provide different kinds of services, such as human assistance, patrolling, tour guiding, object delivery, and human transportation [12, 13]. Surprisingly, this new wave of robots has received little attention to date, even though several of these robots are already a reality.
1.1 Measures of Risks Inherent to Collisions with Robots
Different metrics have been offered to estimate and quantify the injuries deriving from human–robot collisions [14,15,16,17,18]. These works cover industrial robots and define clear criteria for minimizing injuries, taking into account the robot’s shape, weight, velocity, and the direction in which it may approach humans. Current data on injuries apply to robots with low masses (mostly up to 24 kg, with some at 67 kg) and a limited speed range (2–18 m/s). Unlike robot manipulators, mobile robots are heavier (from 20 to 250 kg), move more slowly (expected operational speeds up to 2 m/s), and may come into contact with different body parts (lower limbs primarily). Therefore, the injuries that may result from unexpected contacts between humans and robots may differ substantially from those documented in these works, and it is not possible to extrapolate these data to cases beyond the tested scenarios.
The automobile industry has long documented the injuries that may result from automobile crashes and codified them in a set of criteria, such as the head injury criterion (HIC) and the neck injury criterion (NIJ). These criteria are, however, difficult to apply to smaller mobile robots, which usually operate at velocities one order of magnitude lower. It would nevertheless be incorrect to think that, because mobile robots are slower, only minor injuries may result from contact with pedestrians. The injuries differ from those incurred by a vehicle’s occupant, as the contact is direct: metal to skin. Moreover, as discussed and further studied in an empirical study with robot manipulators, incidents that would be classified as minor injuries by an acceleration-based measure (HIC) could be classified as serious, potentially lethal, injuries under the chest compression criterion (CC). This observation was one of the motivations for the study in Haddadin et al. (2015), where experimental tests were conducted to determine injury criteria specific to robot manipulators, based on their shape, mass and velocity. The results of that study indicate that injuries are unavoidable with robots whose masses exceed 16 kg, and suggest that such robots should reduce their operational speed to close to zero in the vicinity of humans.
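For reference, the HIC mentioned above is defined as the maximum, over all time windows up to 15 ms, of the windowed mean acceleration raised to the power 2.5 and multiplied by the window length. The following is a generic sketch of that computation from a sampled acceleration trace; it is not code from any of the cited studies, and the brute-force window search is chosen for clarity, not speed.

```python
import numpy as np

def hic(accel_g, dt, max_window_s=0.015):
    """Head Injury Criterion (HIC15) from a head-acceleration trace.

    accel_g: acceleration samples in units of g
    dt: sampling interval in seconds
    max_window_s: maximum averaging window (15 ms for HIC15)
    """
    n = len(accel_g)
    # prefix integral of a(t) dt, so window averages are O(1) to evaluate
    cum = np.concatenate([[0.0], np.cumsum(accel_g) * dt])
    max_w = max(1, int(round(max_window_s / dt)))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + max_w, n) + 1):
            t = (j - i) * dt
            avg = (cum[j] - cum[i]) / t
            best = max(best, (avg ** 2.5) * t)
    return best
```

For a constant 100 g pulse lasting longer than 15 ms, this yields 100^2.5 × 0.015 = 1500, a value conventionally associated with a high risk of serious head injury in the automotive literature.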
The injury also depends on the fragility of the human with whom the robot comes into contact. Documenting this in detail is challenging, as it requires sophisticated equipment.
In another study with a mobile robot, collision tests were performed using a six-year-old child dummy and a ten-year-old dummy. Results for masses of 80, 100 and 200 kg and speeds of 0.55 and 1.66 m/s showed low injury probability at the level of abbreviated injury scale AIS 1+ (minor injuries, e.g. superficial laceration or single rib fracture) and AIS 2+ (moderate injuries, e.g. moderate skull fracture), based on HIC, NIJ and CC results. All of these results call for injury criteria that fit the specific objectives and applications of the robot more appropriately, so as to avoid minimizing or exaggerating risks.
It is also relevant to mention the efforts conducted in different EU-funded research projects that contribute to developing safer navigation strategies for mobile robots. For instance, the STRADS project [22,23,24] is devoted to providing better predictive models of pedestrians’ movements in indoor environments. The SPENCER project addresses the importance of embedding in the robot’s controller an understanding of the social rules underlying crowd behaviour, so as to enhance mobile robot navigation (collision-free navigation). The SPENCER project highlights the importance of ensuring that mobile robots are reliable, especially when they operate as assistive robots. However, the project does not explicitly acknowledge the dangers (intended or unintended) that these robots might pose to users in situations outside their specified application scenarios. The objective of the EUROPA project is to develop urban navigation techniques for mobile robots, specifically designed for navigation in pedestrian areas. The project focuses on localization and tracking of obstacles; however, it does not address safety concerns in the case of improper detection or unexpected collisions in densely populated areas. Finally, the ILIAD project develops a collision event pipeline for robots interacting with the environment, to respond to a series of collision events. Nonetheless, further work is required to understand what a viable solution could be for reacting to collisions and for generating safe post-collision movements in unstructured, open pedestrian environments.
This paper complements these efforts and aims to identify and discuss the hazards (physical and psychological) that may arise from having robots operate in crowded environments. Making an exhaustive list of the dangers generated by these robots is difficult, as robots come in all sorts of shapes, in contrast to cars and drones, for lack of standards in their mechanical design. Our goal is not to offer a comprehensive list of these dangers, but to bring an appraisal of the types of hazards that may arise from the fact that these robots navigate in close vicinity to humans: foremost to their operator (wheelchair/Segway user), but also to bystanders. Mobile robots are heavier and taller than pedestrians and can move at a faster pace. They are hence particularly dangerous for children, the elderly, and people with limited mobility (Fig. 2).
Hazards from the use of these robots may become unavoidable as they travel in dense crowds. We hence take a close look at the dangers posed by robots navigating in human crowds, paying special attention to the new forms of interaction they may establish with humans.
To identify possible sources of dangers, we consider hazards deriving from unintentional physical contacts, which may occur as the result of accidents or errors (e.g. unexpected movements of the crowd or of the robot) and hazards deriving from intentional physical contacts, which may occur as the result of functional interactions (e.g. touching the robot for requesting its services). In addition to hazards arising from physical contacts, we also discuss psychological risks, arising mainly from the robot’s social capabilities conveyed by its appearance and behaviour.
Note that other concerns arise regarding the use of robots in public spaces. For instance, as the robot monitors its surroundings, it could breach the rules on data protection (i.e. the GDPR). These are valid concerns, which, however, we will not cover in the present document.
The article is organised as follows: in the next section, we propose a set of case studies based on selected scenarios showing potential sources of injuries for pedestrians. In Sect. 3, we present and discuss physical as well as psychological hazards. In Sect. 4, we reflect on the policies needed to regulate mobile robot traffic.
2 Robots in Crowds: A Few Case Studies
Hazards deriving from interactions with robots cannot be generalized, but should be contextualized based on the characteristics of: (a) the operative environment, (b) the type of crowd, (c) the robots deployed and (d) the scenarios of use.
2.1 Type of Environment
A few examples of challenging settings for mobile robots are transportation areas, such as train/metro stations, airports, and shopping malls, which are characterized by the presence of fixed and dynamic obstacles (including invisible obstacles such as glass doors and walls), inclines, and narrow regions. Distinguishing features of these environments are the high density of the crowd and the unpredictability of the pedestrians’ motion: different flows moving in opposing directions; irregular flows with peaks during train unloading or flight arrivals; flows moving at different paces (e.g. people in a hurry to catch a train versus people moving slowly or waiting for another connection).
2.1.1 Type of Crowds
Crowds can be described according to several parameters: size (i.e. the number of people in a place), density (i.e. the number of people per square meter), activity (i.e. moving or static), goals (i.e. identical for all vs. specific to each individual), social relations (i.e. presence of groups, families, etc.), and environment (i.e. outdoor, indoor, streets, corridors, malls, etc.) (Fig. 3). A crowd can also be characterized by its profile, which can be casual, cohesive, expressive or anti-social. Agents in a crowd can be randomly assigned individual attributes, such as size, gender, age, luggage, walking speed, disabilities, and familiarity with the environment. Finally, groups of people can also be characterized by the type of context, normal activity or emergency (e.g. erratic crowd movement in a fire), and even by incorporating emotions, such as stress, frustration or patience, into simulation models.
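The per-agent attributes listed above can be captured in a simple simulation model. The sketch below is purely illustrative: the attribute names, value ranges, and sampling probabilities are our own assumptions, not parameters taken from the crowd-simulation literature cited in this section.

```python
import random
from dataclasses import dataclass

@dataclass
class Agent:
    """Individual attributes of one simulated pedestrian."""
    age: int
    walking_speed: float      # preferred speed, m/s
    has_luggage: bool
    in_group: bool            # walks as part of a social group
    knows_environment: bool   # familiarity with the venue

def sample_agent(rng: random.Random) -> Agent:
    """Randomly assign attributes to a crowd agent (illustrative distributions)."""
    age = rng.randint(5, 85)
    # assumption: working-age adults walk faster; luggage slows everyone down
    base = 1.4 if 18 <= age <= 65 else 1.0
    has_luggage = rng.random() < 0.4
    speed = base - (0.2 if has_luggage else 0.0)
    return Agent(age, speed, has_luggage,
                 in_group=rng.random() < 0.3,
                 knows_environment=rng.random() < 0.5)
```

A crowd is then simply a list of sampled agents, over which density, mean speed, and group structure can be computed.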
2.1.2 Type of Robots
Given the lack of homogeneity in the shape of mobile service robots, in the following scenarios of use we narrow our analysis to three existing robots: a semi-autonomous powered wheelchair, the autonomous humanoid robot Pepper by SoftBank, and the runfun by Locomotec, a robot designed to support outdoor running.
We selected these three robots as representative of mobile ground robots used in public spaces, since they share many important features with other existing mobile robots: autonomy versus semi-autonomy; humanoid versus machine-like appearance; social interactivity versus no social interactivity. They also involve different kinds of users: the wheelchair’s and runfun’s operator is in partial control of the robot and is the direct beneficiary of the robot’s service. The humanoid robot Pepper may have several beneficiaries (e.g. customers in a shop), none of whom has direct control over the robot’s movement. All three robots interact with a third category of people, namely passers-by and bystanders, who neither have control over the robot nor directly benefit from interacting with it. This bi- or tri-party interaction is hence key to the challenges faced when designing the robots’ controllers. Issues arise when one stakeholder’s benefits conflict with another’s, or when trade-offs have to be found to ensure fluidity of traffic and efficient completion of the robot’s task. We exemplify these bi-party and tri-party conflicts in a few scenarios next. The wheelchair exemplifies issues and ethical dilemmas shared with other robots designed for human transportation and partially operated by their user. The humanoid robot exemplifies issues of social norms arising from the physical appearance and behaviour of the robot, while the runfun illustrates issues of responsibility ascription arising from improper use of the platform.
2.1.3 Scenarios of Use
Consider a semi-autonomous powered wheelchair driven by a person with reduced mobility but intact perception and cognition, moving through the crowded departure area of an airport. As transporting luggage on the wheelchair can be problematic, a Pepper humanoid robot follows, carrying the luggage. The humanoid robot is fully autonomous but can interact with humans through speech synthesis and vision (with an on-board camera and a face/gesture/facial-expression recognition system). The powered wheelchair is operated in a semi-autonomous mode, namely the user’s inputs are combined with inputs coming from the robot’s sensors to assist during driving. In semi-autonomous mode, if the driver stops providing inputs, the wheelchair stops. Otherwise, the wheelchair regulates its velocity and autonomously steers away from obstacles. In case of contact with passers-by, the wheelchair decreases its velocity and reacts in a compliant way. The humanoid robot keeps track of the wheelchair through its on-board sensors (vision, sonar) and communicates with the wheelchair over a wireless link. Wheelchair and humanoid robot must remain within two meters of each other; failing this, communication will be lost and the wheelchair driver will have to track back to the humanoid robot. In the last scenario, we will use a different robot, the runfun, which is designed for outdoor activities.
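The semi-autonomous behaviour described above (stop without driver input, otherwise blend the driver’s command with obstacle avoidance) can be sketched as a simple shared-control law. The blending rule, distance thresholds, and linear weighting below are our own assumptions for illustration, not the controller of any specific wheelchair.

```python
import numpy as np

def blend_command(user_v, avoid_v, obstacle_distance, d_stop=0.5, d_free=2.0):
    """Blend the driver's velocity command with an avoidance command.

    user_v, avoid_v: 2-D velocity commands (m/s)
    obstacle_distance: distance to the closest obstacle (m)
    d_stop, d_free: below d_stop the avoidance command dominates;
                    above d_free the driver's command is passed through
    """
    user_v = np.asarray(user_v, float)
    if not user_v.any():            # no driver input -> the wheelchair stops
        return np.zeros(2)
    # weight shifts linearly toward avoidance as obstacles get closer
    alpha = np.clip((d_free - obstacle_distance) / (d_free - d_stop), 0.0, 1.0)
    return (1 - alpha) * user_v + alpha * np.asarray(avoid_v, float)
```

With a far obstacle the driver’s command passes through unchanged; near the stop distance the controller’s avoidance command takes over entirely, while releasing the joystick always yields a full stop.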
Scenario 1: The wheelchair–humanoid team must move steadily to the gate, which is closing. Suddenly, a disorganized crowd of teens comes moving in the opposite direction. Soon, the wheelchair and the humanoid start manoeuvring their way through the crowd of teens. A few teens, attracted by the cute appearance of the humanoid, come face to face with the robot, wanting to touch it (Fig. 4). To ensure no harm to the teens, the robot’s controller should stop to avoid contact, but this would result in it losing track of the wheelchair and in the wheelchair–humanoid team missing the plane. Being well programmed, the humanoid puts the risk of harming a human above the risk of not completing its task, and hence stops.
The physical appearance of the robot may elicit different reactions in people (e.g. too cute and friendly a robot may become an impediment). Moreover, the size and weight of the robot must be small enough to limit physical risks, should the robot fall over (Fig. 5), as well as psychological risks, such as those related to the robot’s dimensions, which could cause fear and distress in users and pedestrians, impacting the perception of safety. Such psychological hazards are discussed in more detail in Sect. 3.2.
Scenario 2: The wheelchair–humanoid team moves in a column formation, within one meter of one another, to leave some distance in case the front robot should stop suddenly. As they travel through the airport, time-pressured passers-by keep breaking the formation and small groups of standing people keep forming in front of the robots. Following its built-in social rules, the humanoid gently touches the shoulder of a person blocking its way and asks her to let it through so that it can complete its task. Unfortunately, this social feature causes an unexpected reaction. The person touched turns abruptly and collides with the robot, setting it off balance. The humanoid’s control system activates its fall emergency response and swings the robot’s two arms forward to rebalance it. Unfortunately, as it moves its arms, the robot hits a passer-by. The compliant controller is instantaneously activated and the arm bounces off, leaving the passer-by unharmed. This, however, exerts a strong opposite torque on the robot, making it spin and crash into the wheelchair. Upon detecting that the humanoid is about to crash onto its user, the wheelchair rapidly computes a free trajectory and jerks forward. In doing so, it successfully avoids the humanoid but rolls onto a frail elderly person’s foot and catches another person’s coat. The humanoid ends up crashing to the floor. The coat is destroyed, and the injury to the elderly person is serious because of the bystander’s age. Such and other unintentional contacts are discussed in Sect. 3.1.1. Note that the above scenario might lead only to minor injury if, in place of an elderly person, the robot were to fall on a pedestrian wearing ski boots that protect his/her feet. The degree to which a hazard may lead to serious injury hence depends on the fragility of the pedestrians involved in the accident (e.g. whether s/he is male or female, an adult, a child or an elderly person, a pregnant woman, or any other person with a special health condition).
Scenario 3: A passer-by in a hurry and loaded with baggage has to cross a crowded junction in the airport hall, where many Pepper robots are navigating back to their charging stations. The Pepper robots have been implemented with a socially-aware navigation algorithm. Moreover, they are endowed with a transparency interface that allows pedestrians to understand the robots’ intentions. Thanks to the transparency system, the pedestrian can safely cross the junction without losing any time, since she knows what the robots will do (stop at the crossing); why (because they detected her, and not something else); for how long they will wait (she has time to cross with her baggage); and what they will do next (the robots will head in another direction, not hers). We will expand on the need to provide social awareness in the control system in Sect. 3.2.2.
Scenario 4: The semi-autonomous wheelchair is moving in a densely crowded environment. The robot should regulate its behaviour so as not to “freeze” in the middle of the traffic. Indeed, as with its car counterpart, stopping in the middle of traffic is ill advised, as it may result in more harm. The wheelchair can adapt its speed and movements to the crowd flow (see Fig. 6, left); stopping or moving too slowly may lead to harmful events (see Fig. 6, right). Moreover, rapid changes in the navigation plan and compliance with social rules are required to navigate safely around obstacles and to minimize harm upon contact. The controller must hence adapt in accordance with the unspoken social rules regulating distances between pedestrians and the flow of motion within (well-behaved) crowds.
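One simple way to realise the “move with the flow” behaviour above is to bias the robot’s commanded velocity toward the average velocity of nearby pedestrians instead of stopping. This is a minimal sketch under our own assumptions (a fixed blending weight and a hard speed cap), not an implementation of any published crowd-navigation algorithm.

```python
import numpy as np

def flow_adapted_velocity(desired_v, neighbor_velocities, v_max=2.0, w_flow=0.6):
    """Blend the robot's desired velocity with the local crowd flow.

    desired_v: 2-D velocity toward the robot's goal (m/s)
    neighbor_velocities: list of 2-D velocities of nearby pedestrians
    v_max: operational speed cap (m/s)
    w_flow: weight given to the crowd flow (illustrative assumption)
    """
    desired_v = np.asarray(desired_v, float)
    if len(neighbor_velocities) == 0:
        flow = desired_v                       # no crowd: head for the goal
    else:
        flow = np.mean(np.asarray(neighbor_velocities, float), axis=0)
    v = (1 - w_flow) * desired_v + w_flow * flow
    speed = np.linalg.norm(v)
    if speed > v_max:                          # respect the speed cap
        v *= v_max / speed
    return v
```

When the crowd moves with the robot, the command is unchanged; when it moves across the robot’s path, the command is deflected toward the flow rather than dropping to zero, avoiding the freezing behaviour described above.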
Scenario 5: A runfun robot leading the way for its user inadvertently catches a passer-by’s handbag, breaking its strap (Fig. 7). The owner requests that the runfun’s user pay for the damage. The user turns to the manufacturer, which declines to cover the damage, arguing that it was the user’s responsibility to adapt his speed when approaching other pedestrians, which would have led the runfun to reduce its speed in turn, all of which was clearly explained in the user manual. In crowded environments, the robot’s design matters, and it is important to avoid sharp edges, hooks and protruding parts.
3 Sources of Hazards
In this section, drawing on the findings available in the literature on human–robot interaction, mobile robot navigation, and the risk assessment and safety of personal care robots, we identify and describe the hazards emerging from interactions between humans and robots.
In line with Lasota, Fong, and Shah, we consider safety in human–robot interaction to be given by the elimination or mitigation of hazards deriving from physical as well as psychological harm. Hence, in the next sections, we distinguish between physical and psychological hazards.
3.1 Hazards Deriving from Physical Contact
Physical contact occurs when the human body and the robot body (any part of the robot’s external surface, e.g. hands or wheels) come into contact with each other. ISO 13482:2014 defines contact as ‘zero distance between robot and an object in its external environment’. Several occasions may lead robot and human to come into physical contact, and such contacts may generate hazards of different degrees of severity. The most frequent causes of contact are accidents or malfunctions, with often dangerous consequences for humans. However, there are also less dangerous forms of contact, such as gentle touches, taps, pats, handshakes, and even hugs [35,36,37], which are usually deployed in “social interactions”. Other forms of bodily contact may happen, for instance, during assistive tasks, such as with robots designed as walking supports [38, 39] or mobility devices [40, 41].
Most work in the area of safety in physical human–robot interaction (pHRI) [16, 19, 28, 42], including ISO/TS 15066:2016, concludes by limiting the operational velocity of a robot or robot link around a human based on a specified limit of permitted force in case of impact. The ISO/TS 15066:2016 standard defines a set of measures and protocols to establish the risks presented by industrial robots operating in collaborative environments with trained operators. It specifies inertia and velocity operation limits, and recommends using “pain thresholds” measured in force or pressure and fixing limits on these quantities. However, determining safety measures via pain thresholds may not prevent injuries in specific scenarios, as has been shown experimentally.
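The force-to-velocity limit just described can be illustrated with the simplified spring-contact model used in the power- and force-limiting approach of ISO/TS 15066 (its Annex A): treating the contact as a linear spring between two effective masses, the permitted peak force bounds the permitted relative speed. The sketch below states that relation generically; the numerical values in the usage example (140 N, 75 kN/m, a 30 kg platform) are illustrative assumptions, not limits copied from the standard.

```python
import math

def max_transient_speed(f_max, k_body, m_robot, m_body):
    """Maximum relative speed keeping a transient contact below a peak force,
    under a simplified two-body spring-contact model.

    f_max: permitted peak contact force for the body region (N)
    k_body: effective spring constant of the body region (N/m)
    m_robot, m_body: effective masses of robot and body part (kg)
    """
    # reduced mass of the two colliding bodies
    mu = 1.0 / (1.0 / m_robot + 1.0 / m_body)
    # energy balance: 0.5*mu*v^2 = F_max^2 / (2*k)  =>  v = F_max / sqrt(mu*k)
    return f_max / math.sqrt(mu * k_body)
```

For example, with an assumed 140 N force limit, a 75 kN/m body stiffness, a 30 kg platform and a 4.4 kg effective body mass, the permitted relative speed comes out at roughly 0.26 m/s, well below typical mobile-robot operating speeds, which is consistent with the text’s point that force limits translate into very low speeds near humans.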
For instance, in the face of a blunt impact with a robot manipulator, one could use a similar methodology for understanding moderate and minor injuries in constrained and unconstrained collisions with mobile robots. Subsequent work by Mansfeld et al. presents an excellent compendium of biomechanical injury data and a method for classifying robot manipulator tasks and operation modes by dividing them based on the level of injury they could produce in case of collision. This work gives useful insight into how to map potential risks over the human body based on the robot’s operational modes.
Nonetheless, constructing similar safety maps for mobile robots operating in unstructured human environments still requires further work: firstly, obtaining reliable data on the biomechanical effects on the human body of unconstrained collisions with robots in the operational range (weights and velocities) in which service robots are expected to work; secondly, characterising the human body with respect to the physical attributes of the person involved in the accident (gender, age, health conditions, etc.): indeed, vulnerable people, such as children, the elderly and pregnant women, could be particularly at risk of serious injuries in the event of a collision with a robot; thirdly, designing control strategies that account for multiple scenario-based situations in terms of safety operational modes, e.g. situations where, given a certain density of pedestrians, it would be less dangerous to softly collide with and push a person ahead than to stop and risk having a large crowd pile up.
The work by Haddadin, De Luca, and Albu-Schäffer could be a useful starting point for analysing mobile robots’ control and behaviour through a similar pipeline of collision events: from pre-collision (planning actions), through detection, isolation, identification, classification and reaction, to post-collision response. The current state of the art has focused mostly on the first phase only (pre-collision) [22,23,24,25, 27], while the detection, isolation, and identification of collisions for mobile robots is still a complex problem addressed in few works [43, 44]. Moreover, reaction and post-collision response remain largely unexplored.
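The pipeline above can be transcribed as a simple ordered state sequence, which makes explicit which phases current mobile-robot work covers and which remain open. The phase names follow the pipeline of Haddadin, De Luca, and Albu-Schäffer as listed in the text; the sequential-advance logic is a deliberate simplification for illustration.

```python
from enum import Enum, auto

class CollisionPhase(Enum):
    """Phases of the collision event pipeline described above."""
    PRE_COLLISION = auto()    # planning and avoidance (bulk of current work)
    DETECTION = auto()        # has a contact occurred?
    ISOLATION = auto()        # where on the robot/body did it occur?
    IDENTIFICATION = auto()   # which external forces are acting?
    CLASSIFICATION = auto()   # accidental impact or intended touch?
    REACTION = auto()         # immediate safe behaviour
    POST_COLLISION = auto()   # recover and resume the task (largely unexplored)

PIPELINE = list(CollisionPhase)

def next_phase(phase: CollisionPhase) -> CollisionPhase:
    """Advance one step along the pipeline (terminal phase is absorbing)."""
    i = PIPELINE.index(phase)
    return PIPELINE[min(i + 1, len(PIPELINE) - 1)]
```

In a real controller each transition would of course be event-driven (e.g. DETECTION fires on an unexpected force residual), not a fixed sequence.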
Physical contact between mobile robots and humans is generally considered dangerous and disturbing and, for these reasons, unacceptable. Hence, most navigation algorithms for mobile robots are designed for safe collision avoidance [47, 48]. However, even when safe and endowed with socially “aware” navigation (e.g. social norms and proxemics rules implemented in the robot’s behaviour), collision-avoidance algorithms are not the most efficient solution from a functional standpoint, especially for robots operating in crowded spaces. Indeed, in an environment jammed with people, the robot could find itself perpetually stuck (the “freezing robot problem”) due to the lack of a collision-free path; see also the illustration of the freezing robot problem in Scenario 4. The consequences of being stuck are delays in the accomplishment of the robot’s service or, in the worst cases, the impossibility of carrying out its task (e.g. delivering an object), as illustrated in Scenario 1. Moreover, as shown in Fig. 2, a robot unable to move becomes an obstacle for other people or other robots, hence compromising safety and negatively affecting traffic fluidity in congested areas.
Among the solutions proposed by scholars for increasing robot safety and efficiency in crowded or cluttered situations is the possibility of accepting physical contacts with objects and also with humans, for instance by developing cooperative navigation models [52, 53].
In the study by Shrestha et al., the authors propose a solution for improving navigation and safety based on intentional physical contact between the robot and the human. They consider the possibility of inducing a person to move aside by means of physical contact. The study investigates whether it is possible to predict bystanders’ reactions to a robot’s touch. By correlating different contact points (i.e. upper arm, lower arm, upper back and lower back) with the force and the direction of the human response, the experimental results confirm that the human reaction is quite consistent with the direction of the contact force when static human subjects are touched by a robot arm. In another study, intentional physical contact is used to buffer highly probable contacts during navigation. The authors propose a preliminary demonstration of a novel control framework for forearm contact during robot navigation, in which a mobile robot employs its forearm to rub against the human body in order to accomplish its navigation goal in space-constrained navigation scenarios.
Indeed, the possibility of accepting and using physical contact with pedestrians or objects replicates a frequent habit among humans: contacts among pedestrians are frequent and accepted in highly crowded and cluttered environments. Therefore, rather than moving so as to minimize the risk of contact (which may be more disruptive to the crowd), these robots could be designed to accept contact while minimizing the risk of injury. Safe contact can be generated through compliant behaviours and by using appropriate shapes and materials, for instance by covering the robot with soft material, avoiding sharp edges and protruding parts, and reducing the possibility of being caught in the robot’s joints or structure and dragged.
In Table 1, we propose a preliminary list of hazards emerging from physical interactions.
Each type of physical contact listed in the table can be the result of intentional or unintentional actions. Moreover, physical contacts can be further characterised according to the initiator of the action: (1) robot-to-human, when contact is initiated by the robot; (2) human-to-robot, when contact is initiated by the person; and finally, (3) cooperatively-initiated contact, which occurs when both human and robot play an active role in establishing contact (e.g. object exchange or handshaking).
In this paper, we focus on robot-to-human and human-to-robot contacts. Moreover, we do not take into account physical contacts deriving from the use of control interfaces, such as joysticks or buttons. As a matter of fact, the safety of the person who operates the robot has received substantial attention in several safety standards. In contrast, we wish to highlight safety concerns with respect to pedestrians.
In the next sub-sections, we discuss the following hazard items:
unintentional physical contacts;
intentional physical contacts (robot to human);
intentional physical contacts (human to robot).
Drawing on the list of hazards provided in the literature, Table 2 summarises the results of a preliminary identification analysis of psychological hazards. The list is not exhaustive, and it covers only hazards deriving from interactions between pedestrians (i.e. passers-by or bystanders) and robots.
3.1.1 Unintentional Physical Contacts
Physical contacts are “unintentional” when they are independent of the robot’s or the user’s intentions. Unintentional contacts take place accidentally, when the robot or the human passes the “point of no escape”, making contact inevitable.
The most striking evidence that collisions between humans and robots can happen is the first pedestrian death associated with self-driving technology.
According to ISO 12100:2010, unintended contacts can be further divided into “dynamic” contact, if the person can retract after the impact, and “quasi-static” contact, if the contact is prolonged. This distinction is tailored to collaborative industrial robots, where, after a collision has occurred, the escape route can be minimal; hence, the risk for the worker increases when he/she is trapped between the robot and a nearby obstacle. In a public environment, such as an airport, the risk of quasi-static contact is less controllable than in a workspace, as is the risk deriving from sudden movements of pedestrians.
For instance, if a person makes a sudden move without leaving the robot enough time to swerve, depending on the situation the robot could hit the person, crush the person against an object, or roll over (crush) a person’s foot. Other unintentional contacts may arise from catching pieces of clothing, as illustrated in scenario no. 2.
To avoid such contacts, it is crucial to endow the robot with full sensor coverage of its environment. Figure 8 (top) illustrates the visual coverage resulting from a typical choice of sensor placement, with a LiDAR on top and proximity sensors at the bottom of the platform. Because of the narrow vertical field of view of these sensors, the wheelchair has large blind areas.
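The extent of such blind areas follows from simple geometry. As a hedged illustration (the mounting height and field-of-view values below are hypothetical, not taken from Figure 8):

```python
import math

def ground_blind_radius(mount_height_m: float, min_elevation_deg: float) -> float:
    """Radius of the floor disc that a top-mounted range sensor cannot see.

    mount_height_m    : height of the sensor above the floor (assumed)
    min_elevation_deg : steepest downward ray angle below horizontal (assumed)
    """
    # A ray tilted `min_elevation_deg` below horizontal first reaches the
    # floor at horizontal distance h / tan(angle); anything closer is unseen.
    return mount_height_m / math.tan(math.radians(min_elevation_deg))

# e.g. a LiDAR mounted at 1.2 m whose beams point at most 15 degrees below
# horizontal leaves a blind disc of radius roughly 1.2 / tan(15 deg), i.e.
# several metres around the base -- exactly where a child or a foot may be.
```

This is why low-mounted proximity sensors are needed in addition to the LiDAR: the blind radius shrinks only as the downward field of view widens or the sensor is mounted lower.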
3.1.2 Intentional Physical Contacts (Robot to Human)
Intentional, robot-to-human physical contacts can be either active or passive. The former are physical contacts aimed at inducing a reaction in a human: for instance, when a robot is stuck in front of a group of people and needs to move on, it may touch a person’s back with its hand . Other forms of active intentional contact can take place during social interactions: shaking hands, offering the arm as walking support, giving a high five. The latter type refers to contacts occurring when no reaction from a human is sought, for instance when a robot intentionally swipes against a person because it evaluates this action as the best solution to ensure safety during navigation , as described in scenario no. 2.
Among the most challenging technical problems related to intentionality is distinguishing between intended and unintended physical contacts. Kouris and colleagues address the problem of differentiating unexpected collisions from voluntary contacts during human–robot collaboration, in order to improve the operator’s safety . The authors use the frequency characteristics of cooperation and collision forces, which are measured by the robot’s proprioceptive sensors. The most challenging aspects of classification are achieving reliable detection of collisions, eliminating false positives, and ensuring very short reaction times.
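The intuition behind frequency-based contact classification can be sketched as follows. This is our own simplified illustration, not the classifier of Kouris et al.; the cutoff frequency and threshold are arbitrary placeholder values.

```python
import numpy as np

def classify_contact(force_signal: np.ndarray, fs_hz: float,
                     cutoff_hz: float = 10.0, threshold: float = 0.5) -> str:
    """Label a force-sensor window as 'collision' or 'cooperative'.

    Collisions excite high frequencies (a sharp transient), while
    cooperative contact is dominated by slow, deliberate force profiles.
    All numeric parameters here are illustrative placeholders.
    """
    # Spectrum of the zero-mean signal over the sliding window.
    spectrum = np.abs(np.fft.rfft(force_signal - np.mean(force_signal)))
    freqs = np.fft.rfftfreq(len(force_signal), d=1.0 / fs_hz)
    # Fraction of spectral energy above the cutoff decides the label.
    high = spectrum[freqs >= cutoff_hz].sum()
    total = spectrum.sum() + 1e-12  # guard against an all-zero window
    return "collision" if high / total > threshold else "cooperative"
```

A real system would additionally need to meet the reaction-time constraint mentioned above, e.g. by classifying short overlapping windows rather than whole recordings.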
Besides safety implications, there are also non-technological challenges related to societal acceptability, which need careful sociological, but also ethical and legal, consideration (e.g. would a person—child or adult—consent to being touched by a robot?). Owing to space constraints, we cannot address these aspects in detail; we refer the reader to .
3.1.3 Intentional Physical Contacts (Human to Robot)
Human-to-robot, intentional contacts are deliberate contacts made by a person to induce a behaviour in the robot (e.g. controlling the robot’s speed). These types of contacts are targeted at specific parts of the robot’s body, where touch is permitted by the designer. These parts may vary depending on the robot’s morphology (e.g. wrist, back of the hand, shoulder). Indeed, many robots, like Pepper, are endowed with tactile body parts consisting of capacitive sensors. Tactile sensors are often used to detect human contact within HRI applications . They allow the robot to “feel” touch and to identify the part of the body touched.
Intentional contacts can be particularly useful in social interactions with robots. For instance, a gentle touch on the arm can stop the robot; a tap on the shoulder can reduce the robot’s speed during a guiding task; touching another part of the robot (e.g. the tablet screen) can be used to request its services. However, unless characterised by strong affordances, the use of tactile body parts implies that the user knows in advance which parts of the robot can be activated and which actions are triggered once they are.
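A minimal sketch of such a touch-to-action mapping, in the spirit of the examples above; the zone names and triggered behaviours are hypothetical, not the interface of any particular robot.

```python
# Hypothetical mapping from tactile zones to behaviours. Each zone is a
# body part where the designer has placed a capacitive sensor.
TOUCH_ACTIONS = {
    "arm": "stop",            # a gentle touch on the arm stops the robot
    "shoulder": "slow_down",  # a tap on the shoulder reduces guiding speed
    "tablet": "open_menu",    # touching the screen requests services
}

def on_touch(zone: str) -> str:
    """Return the behaviour triggered by touching `zone`; unknown or
    non-instrumented zones are deliberately ignored for safety."""
    return TOUCH_ACTIONS.get(zone, "ignore")
```

The point of the explicit table is transparency: it makes it possible to document (and display to lay users) exactly which touches do what, addressing the affordance problem discussed above.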
The lack of preliminary information on how to use the robot (e.g. tutorials or user manuals), as well as the lack of appropriate feedback to users, may lead to misuse and to dangerous situations .
Moreover, touch can be the consequence of an emotional response to the robot appearance or behaviour (a hug, a caress, a simple touch) as described in the scenario no. 1.
A different case is when intentional contact is aimed at damaging or destroying the robot, i.e. vandalism . In order to guarantee the security of the robot and consequently the safety of all its stakeholders, designers should seriously take into account the robot’s vulnerability to abusive physical interactions .
3.2 Hazards Deriving from Psychological Interactions
Psychological safety is concerned with the elimination or mitigation of hazards deriving from psychological harms, such as discomfort or stress. According to Lasota, Fong, and Shah ‘maintaining psychological safety involves ensuring that the human perceives interaction with the robot as safe, and that interaction does not lead to any psychological discomfort or stress as a result of the robot’s motion, appearance, embodiment, gaze, speech, posture, social conduct, or any other attribute’ .
In order to ensure psychological safety, the authors suggest adjusting the robot’s behaviour and appearance to the personality traits, experience and culture of the human user . This can be done, for instance, by modifying the robot’s speed, acceleration, or proxemics, by balancing the level of anthropomorphism in its external appearance, or by implementing in the robot’s behaviour the social conventions commonly used in human-to-human interactions, such as turn-taking, eye contact, or giving the right of way during navigation. The proposed solutions are still tailored to the subjective characteristics of the individual user; therefore they may not be easily applicable to robots operating in crowded spaces, where the focus is not exclusively on users but also on pedestrians, that is, people who merely share the floor with the robot, like passers-by .
In addition to behaviour and appearance, psychological hazards can also be brought about by physical contacts between the human and the robot, which may become more frequent in service applications as well. Indeed, physical contact is becoming a popular interface in human–robot interaction. However, people’s reactions to robot touch can differ, depending on their attitude (i.e. positive or negative) towards the robot .
In robotics, psychological hazards have been mainly addressed in the framework of occupational safety , with respect to industrial and collaborative robots. Among the main psychological hazards for workers’ health, there are mental stress and cognitive burden deriving from interactions with robots .
British Standard 8611:2016 is the only soft law instrument explicitly addressing psychological harms in human–robot interactions, outside of industrial applications . It defines psychological risks as ethical harms, namely ‘anything likely to compromise psychological and/or societal and environmental well-being’.Footnote 3 This standard expands the range of psychological hazards, so far mainly focused on stress and anxiety, by adding to the list: ‘embarrassment, addiction, deception, humiliation, being disregarded’ .
In 2017, the European Parliament published its recommendations to the Commission on the use of robots, where, among the various dangers identified, a new psychological hazard was given special attention: ‘the possible development of an emotional connection between humans and robots ‒ particularly in vulnerable groups (children, the elderly and people with disabilities) ‒ and the issues raised by the serious emotional or physical impact that this emotional attachment could have on humans’ . The risk of developing affective bonds with robots had already been addressed in the scientific literature [74,75,76]. Although this risk concerns social robots in particular, all robots servicing in public spaces can exhibit some degree of anthropomorphism, either in their appearance or behaviour, given their need to adapt to social norms during navigation and to make interactions with humans as easy as possible. Some of the psychological harms that can be caused by the development of emotional connections with robots have so far been investigated in the literature on companion robots and their interaction with vulnerable people (i.e. the elderly and children), and include subconscious engagement, dependence, and social isolation [77,78,79,80], to mention just a few.
To sum up, by psychological hazards we mean the (serious) deterioration of people’s mental health as a consequence of interactions with robotic devices. In particular, psychological hazards may affect a person’s cognitive, social and even emotional capabilities. They can be caused by robot movement, appearance and social capabilities, including forms of physical contact. Potential consequences on health are stress, anxiety, discomfort, fear, emotional attachment, etc.
In Table 3, we propose a preliminary list of psychological hazards emerging from interactions with humans.
In the next sub-sections, we discuss the following hazard items:
Hazards deriving from robot appearance
Hazards deriving from robot motion
Hazards deriving from physical contact
Drawing on the list of hazards provided in , Table 4 summarises the results of a preliminary identification analysis of psychological hazards. The list is not exhaustive and covers only hazards deriving from interactions between pedestrians (i.e. passers-by or bystanders) and robots.
3.2.1 Robot Appearance
Discomfort can be caused merely by the robot’s presence, in particular for special categories of people, such as children, the elderly, and people with cognitive disabilities . The reasons can be many: cultural, religious, or highly subjective (personal attitude towards technology, level of familiarity with technologies). For instance, people may feel uncomfortable because of the presence of cameras and microphones on board the robot. The feeling of being watched by the robot, and potentially by other people, could be a cause of apprehension. We refer to this hazard as the feeling of being under surveillance or spied on, which may cause stress, anxiety, and in some cases even violent reactions towards the robot [92, 93].
A robot’s appearance is a sensitive matter, which must be designed carefully in order to avoid different types of hazards. On the one hand, a robot’s appearance may evoke eerie feelings in the beholder, linked to nervousness or fear. The Uncanny Valley theory is a case in point [85, 94]. This is a phenomenon experienced by people confronted with robots that closely resemble living humans or animals: the level of pleasantness and familiarity experienced by humans grows in proportion to the realism of the robot, up to a point at which there is a sudden fall in positive emotional responses.
On the other hand, the level of realism in a robot’s appearance (anthropomorphism or zoomorphism) can be a matter of concern due to the feeling of social presence  generated in the beholder, which could lead to the humanisation of the robot and hence to the development of forms of emotional attachment towards it , especially by vulnerable people, such as children, the elderly and people with disabilities (see scenario no. 1).
Moreover, the robot’s shape plays a determining role in communicating the robot’s level of perceived safety . The perception of safety is given by physical attributes such as the robot’s dimensions, shape, balance, etc. For instance, the presence of sharp edges or protruding parts in a robot’s cover may elicit a feeling of danger or fear in the user or passer-by because of the risk they may create (as illustrated in scenario no. 5). Similar fears can be triggered by other features, such as the robot’s size relative to the pedestrian (robot to child, for instance), as illustrated in scenario no. 1.
Finally, the robot’s appearance is related to the design of affordance and transparency. The former is concerned with the understanding of the robot’s function and usage. For instance, a low level of affordance can cause errors and confusion in the user, increasing cognitive burden and consequently stress and anxiety [96, 97] (for an example, see scenario no. 3). The latter has to do with the presence of interfaces that allow a person to understand what the robot is doing and why [63, 96, 98]. For instance, a low level of transparency can cause misunderstanding in pedestrians due to the poor legibility of the robot’s intentions, negatively affecting human–robot interaction. The lack of transparency, besides being stressful for users and pedestrians, can have dangerous consequences (see the example in Sect. 2.1, scenario 3).
3.2.2 Robot Movements
Robot movements are usually considered dangerous because of the physical harms they can cause: e.g. crushing, collision, cuts, abrasions, etc. . However, in this sub-section we look at the psychological hazards caused by robot movements to users and pedestrians.
Besides appearance, movement is another factor determining the Uncanny Valley. Indeed, increasing the realism of a robot’s movements does not necessarily yield higher acceptance from users. For instance, a robot that does not move as smoothly as a human being may be perceived as fearsome and elicit unpleasant, disturbing feelings, causing stress and anxiety . Among the possible causes investigated in the literature are the fear of losing bodily control  and the mismatch with our expectations caused by incongruent robot movement .
The perception of a robot’s safety also depends on its movements. Motion may increase trust in the robot, for instance when it actively avoids obstacles and shows reliable safety behaviour (e.g. stops or slows down in front of a human, turns away, acts compliantly upon contact, i.e. is perceived as a free mass). On the contrary, motion can decrease trust when the robot is perceived as unstable or too fast. In , the continuous forward and backward movement of an autonomous Segway platform while standing still provoked in its users a feeling of instability and weakened the perceived safety of the robot during interaction.
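Behaving “as a free mass” upon contact is commonly achieved with an admittance law, in which measured contact force drives the commanded velocity. A minimal single-axis sketch, with purely illustrative mass and damping values (not drawn from the sources cited above):

```python
def admittance_velocity(contact_force_n: float,
                        virtual_mass_kg: float = 20.0,
                        damping: float = 15.0,
                        current_vel_mps: float = 0.0,
                        dt_s: float = 0.02) -> float:
    """One control step of a simple admittance law: the robot yields to
    contact force as if it were a damped free mass.

    All parameter values are illustrative tuning choices.
    """
    # m * dv/dt = F - b * v  ->  explicit Euler update of commanded velocity.
    accel = (contact_force_n - damping * current_vel_mps) / virtual_mass_kg
    return current_vel_mps + accel * dt_s
```

Pushing on the robot then makes it drift away smoothly instead of resisting rigidly, which is precisely the compliant behaviour that the cited work associates with higher perceived safety.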
We argue that the lack of compliance with social rules during navigation can be considered a source of physical as well as psychological hazard. The importance of socially-aware navigation is by now well established in the robotics literature [101,102,103,104]. As a matter of fact, in interpersonal interactions, people tend to respect personal space, move to one side of hallways, and yield right-of-way. This is made possible by culturally shared conventions and by the capability of interpreting human communication (e.g. movements, gestures, eye contact, etc.). In human–robot interaction, research demonstrates that implementing socially aware navigation improves robot performance and safety (by reducing stress and discomfort during navigation), as illustrated in scenario no. 3. For instance, a robot not respecting personal space can cause discomfort in pedestrians [105, 106].
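In socially-aware navigation, respect for personal space is often encoded as an extra cost layer around each detected person, which the planner then avoids. A minimal sketch, assuming an isotropic Gaussian with hypothetical tuning values (the cited works use richer, often anisotropic models):

```python
import math

def personal_space_cost(dx: float, dy: float,
                        amplitude: float = 100.0, sigma: float = 1.2) -> float:
    """Gaussian navigation cost around a pedestrian at the origin.

    (dx, dy)  : candidate robot position relative to the person
    amplitude : peak cost at the person's location (illustrative)
    sigma     : spread of the personal-space bubble in metres (illustrative)
    """
    # Cost decays with squared distance, discouraging paths that cut
    # through the personal-space bubble while still allowing passage.
    return amplitude * math.exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma))
```

Summing this cost over all nearby pedestrians and adding it to the obstacle costmap biases the planner towards socially acceptable paths without making personal space a hard constraint.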
Nevertheless, not all social rules in use in interpersonal relations can be equally applicable and effective in human–robot interaction. For instance, while we accept walking next to an unknown person, we may not feel the same with a robot. In the work by Hanajima et al., the authors investigated human subjects’ responses to the approach motions of a mobile robot by varying speed and proximity (slow/fast and close/far away). Drawing on psychophysiological analysis (i.e. electrodermal activity and the semantic differential technique), the study showed that a robot moving close to a person increases the level of anxiety (independently of its speed) compared to a robot moving at a greater distance .
In another study, Pham et al. considered the psychological effects of a robot’s movement on pedestrians, in terms of their feelings of safety and comfort. They investigated the effects of a personal mobility vehicle (i.e. a hoverboard) in pedestrian flows using the concept of personal space, the space whose invasion by others induces psychological strain. The authors found that the pedestrians’ level of fear and discomfort towards the personal mobility vehicle increased as pedestrian density increased .
3.2.3 Physical Contact
Touch is becoming a key form of interaction between humans and robots also outside of the industrial sector, where, for a decade or so, collaborative robots have been removing the barriers separating workers and machines .
As discussed in Sect. 3.1, in crowded environments, physical contact between pedestrians and robots can be inevitable. Therefore, endowing robots with the ability to recognize contacts with electronic skin and use touch to communicate with humans can be important for improving safety (by integrating vision based approaches)  as well as facilitating social interactions , or improving trust in human–robot collaboration .
However, the experience of being touched by a robot, or even just the idea of it, can have different levels of acceptability, and robot touch may not be as effective as touch among humans . In the study by Shrestha et al., the authors investigated participants’ subjective responses towards robot-initiated touch during navigation. Results show that prior experience with robots produces slightly better responses from humans, and that a verbal warning prior to contact yields much more favourable responses. The authors state that, in general, participants in the study did not find contact to be uncomfortable and were not opposed to robot-initiated contact if deemed necessary .
As for the effectiveness and acceptability of contact/tactile interactions, Willemse, Toet and van Erp investigated the equivalence of interpersonal touch and human–robot touch (HRT), namely whether robot-initiated touches induce physiological, emotional, and behavioural responses similar to those reported for human touches. According to the authors, ‘merely simulating a human touching action with the robot’s limbs is insufficient to elicit physiological, emotional, and behavioural responses in this specific context and with this amount of participants’ . Finally, the authors point out that in order to evaluate the effectiveness of a robot’s touch, it is necessary to take into account ‘the robot’s touching behaviour, its appearance and behaviour, the user’s personality, the body location where the touch is applied, and the (social) context of the interaction’ .
In a study by Arnold and Scheutz, the authors confirm that robot touch enhances the social appraisal of a robot as a worker and teammate. The authors point out the critical importance of gender roles and workplace expectations in the evaluation of a robot’s touch . According to the findings of Hoffmann, the acceptance of touch may depend on robot shape: small, pet-like robots are more favourably accepted, since we tend to associate them with inoffensive, baby-like creatures. Quite interestingly, higher acceptance was revealed for functional than for affective touch. Cultural norms play an important role in the assessment of being touched by a robot, as does the context: legal/illegal, pleasant/unpleasant, or positive/negative, to which we should add safe/unsafe .
This paper offered an overview of the hazards entailed by the introduction of robots moving in densely populated pedestrian areas. These robots will primarily present risks to human physical safety. Because they travel in close vicinity to pedestrians, unlike cars, they are highly likely to enter into physical contact with humans. As these robots are heavy and can travel faster than humans, contact may generate strong forces and be particularly detrimental to people with reduced mobility and to children.
Implementing human aware navigation algorithms—including social norms and proxemics rules—as well as endowing robots with transparency interfaces for disclosing robot actions (current and future)—can contribute to reducing the hazards inherent to deployment in public spaces. Examples of implementation can be found in [120, 121].
In addition to physical risks, in this study we pointed out the need to also consider psychological risks. We identified a preliminary set of hazards affecting people’s mental health during interactions with robots. Again, specific methods and tools are needed to measure the severity of psychological hazards.
Robots servicing in public spaces expose a new category of people to hazards, namely pedestrians. The severity of hazards (physical as well as psychological) depends on the characteristics of the human subject involved in the impact (e.g. gender, age and health). To be completely safe, a robot servicing in public and crowded environments must be designed taking such a diversity of features into account. In other words, designers should be aware that a human obstacle is not just a male adult, but can also be a child, an elderly person, a pregnant woman or any other person with a special health condition. Besides robot designers, we believe that it is now important to inform the general public of the dangers generated by the new robots.
Finally, as artificial intelligence advances and autonomous robots in their many forms perform more and more tasks, not only will individual robots navigate increasingly dense crowds (with the increasing density of urban areas), but these robots will themselves start proliferating and may outnumber us in our public spaces. While this nightmarish thought may hopefully never come to be a reality, it is possible that in certain circumstances robots may at times outnumber humans. For instance, in industrial environments, autonomous transporters will gradually replace human-driven ones and may hence become quite frequent in factory corridors, using a large part of the factory floor. This may also become true of pedestrian walks particularly suited to powered strollers and other automated personal transporters.
This manuscript did not discuss dangers arising from the deployment of other autonomous robots such as autonomous cars and drones, as our focus was to highlight dangers arising from designing robots meant to navigate in pedestrian alleys. However, these other two categories of autonomous vehicles also present many dangers to pedestrians.
While, in principle, cars are not allowed to drive on pedestrian alleys, there is an increasing tendency to allow cars to share the same floor as pedestrians (e.g. in the city centres of some European cities). Like ground robots, autonomous cars have autonomy of Level 3–4 of the SAE standard. Collisions with autonomous cars are even more detrimental to pedestrians, since cars are one order of magnitude heavier than the small mobile robots we considered here. A car is, however, much more visible than an autonomous mobile robot (e.g. a delivery robot) or a wheelchair in a crowd. Moreover, people are much more acquainted with the dangers represented by cars than they are with mobile robots.
Drones share many features with ground mobile robots (full autonomy, no operator on board). If the home-delivery drone market were really to take off, we might start encountering flocks of drones around doorsteps, and, during the transient periods when they fly at human height, drones may create hazards for bystanders. These hazards are, however, different from those generated by robots on the ground. Drones move at much higher speeds than ground robots and are barely visible to pedestrians, since they usually fly above our heads. The dangers that drones may one day create for pedestrians should, hence, not be neglected. However, as for autonomous cars, these issues deserve a dedicated report.
While autonomous cars have their own space (the roadway), robots servicing in public spaces will share the space of people (i.e. pedestrians). Limiting their misuse and undue proliferation may require public policy. Should we think ahead and start changing the design of our pedestrian walks, dividing them (as we do with swimming lanes) into fast-moving lanes, permitted to robots, and slow-moving lanes restricted to protected pedestrians (the elderly, families with small children, etc.)? Should we start monitoring speed limits of robots and impose fines for not respecting lane usage and speed recommendations?
Limiting the number of robots to prevent undue proliferation may not be an easy issue, especially in free-market societies. It becomes an issue of public policy to determine the right trade-off between the benefits and harms brought by robots in public space. Perhaps some robots may be prioritized over others (e.g., robotic wheelchairs vs. delivery robots) because of greater human benefits (consequences), or because the “right” to a human’s autonomous locomotion in a robotic wheelchair trumps a faster delivery time for receiving a consumer product.
Public awareness is also necessary to start a conversation about public policies. New policies are required to regulate the deployment of mobile robots in public areas, including their tasks and speed of motion, and to enforce adequate training of robots’ end-users, when necessary.
In this article, the terms pedestrians, bystanders and passers-by are used to loosely define laypersons of any age, gender, physical and cognitive conditions who share the same vicinity with a mobile service robot.
General Data Protection Regulation. The regulation is applicable as of May 25th, 2018 in all member states to harmonize data privacy laws across Europe.
In this study, we prefer the phrase psychological hazards to ethical harms, as proposed in . In our opinion, ethics and safety are two different concerns. Safety is concerned with the reduction or mitigation of hazards, physical as well as psychological. Moreover, safety is an ethical value, like privacy, dignity, freedom, etc. Ethics is the art of making the best choice when a trade-off among values is needed.
Hussain R, Zeadally S (2018) Autonomous cars: research results, issues and future challenges. In: IEEE communications surveys and tutorials. https://doi.org/10.1109/COMST.2018.2869360
Ross PE (2018) Iceland’s consumers try drone delivery: the startup Aha takes on Amazon with basic drones bearing burgers—[News]. IEEE Spectr 55(10):12–13
Hawkins AJ (2019) Thousands of autonomous delivery robots are about to descend on US college campuses. THE VERGE (Accessed December, 2019). https://www.theverge.com/2019/8/20/20812184/starship-delivery-robot-expansion-college-campus
Wang S, Christensen HI (2018) TritonBot: first lessons learned from deployment of a long-term autonomy tour guide robot. In: Proceedings of the 27th IEEE international symposium on robot and human interactive communication, Nanjing, China, August 27–31, 2018
Scudellari M (2017) Lidar-equipped autonomous wheelchairs roll out in Singapore and Japan. IEEE Spectrum, 22 Sept 2017 | 15:00 GMT. Available Online https://spectrum.ieee.org/transportation/self-driving/lidar-equipped-autonomous-wheelchairs-roll-out-in-singapore-and-japan. Accessed October 31, 2018
Jiang BC, Gainer CA (1987) A cause-and-effect analysis of robot accidents. J Occup Acc 9(1):27–45. https://doi.org/10.1016/0376-6349(87)90023-X
Stanciu SC, Eby DW, Molnar LJ, Zanier N, Kostyniuk LP (2018) Pedestrians/bicyclists and autonomous vehicles: how will they communicate? Transp Res Rec. https://doi.org/10.1177/0361198118777091
Plioutsias A, Karanikas N, Chatzimihailidou MM (2018) Hazard analysis and safety requirements for small drone operations: to what extent do popular drones embed safety? Risk Anal 38:562–584. https://doi.org/10.1111/risa.12867
Pandey AK, Gelin R (2018) Pepper: the first machine of its kind. A mass-produced sociable humanoid robot. IEEE Robot Autom Mag 25:40–48
Ivanov SH, Webster C, Berezina K (2017) Adoption of robots and service automation by tourism and hospitality companies. Rev Turismo Desenvol 27(28):1501–1517
Haddadin S, Albu-Schäffer A, Hirzinger G (2007) Safety evaluation of physical human–robot interaction via crash-testing. In: Proceedings of robotics: science and systems RSS (pp 217–224)
Haddadin S, Khoury A, Rokahr T, Parusel S, Burgkart R et al (2012) On making robots understand safety: embedding injury knowledge into control. Int J Robot Res 31(13):1578–1602
Mansfeld N, Hamad M, Becker M, Marin AG, Haddadin S (2018) Safety map: A unified representation for biomechanics impact data and robot instantaneous dynamic properties. IEEE Robot Autom Lett 3(3):1880–1887
ISO/TS 15066:2016 (2016) Robots and robotic devices—collaborative robots. ISO, Geneva, Switzerland
Park MY, Han D, Lim JH, Shin MK, Han YR, Kim DH et al (2019) Assessment of pressure pain thresholds in collisions with collaborative robots. PLoS ONE 14(5):1–12. https://doi.org/10.1371/journal.pone.0215890
Haddadin S, Albu-Schäffer A, Hirzinger G (2009) Requirements for safe robots: measurements, analysis and new insights. Int J Robot Res 28(11–12):1507–1527. https://doi.org/10.1177/0278364909343970
Fujikawa T, Kubota M, Yamada Y, Ikeda H (2013) Estimating child collision injury based on automotive accident data for risk assessment of mobile robots. In: IEEE international conference on intelligent robots and systems (pp 2902–2907)
Kim HY, Park JH, Yun S, Moon S, Gwak KW (2017) Preliminary experimental results for chest compression in mobile robot–human unconstrained-collision. In: International conference on control, automation and systems, vol 2017, pp 893–895. https://doi.org/10.23919/ICCAS.2017.8204352
Fentanes JP, Lacerda B, Krajnik T, Hawes N, Hanheide M (2015) Now or later? Predicting and maximising success of navigation actions from long-term experience. In: Proceedings—IEEE international conference on robotics and automation, pp 1112–1117. https://doi.org/10.1109/ICRA.2015.7139315
Dondrup C, Hanheide M (2016) Qualitative constraints for human-aware robot navigation using Velocity Costmaps. In: 25th IEEE international symposium on robot and human interactive communication, RO-MAN 2016, pp 586–592. https://doi.org/10.1109/ROMAN.2016.7745177
Krajnik T, Fentanes JP, Santos JM, Duckett T (2017) FreMEn: Frequency map enhancement for long-term mobile robot autonomy in changing environments. IEEE Trans Rob 33(4):964–977. https://doi.org/10.1109/TRO.2017.2665664
Okal B, Arras KO (2016). Learning socially normative robot navigation behaviors with Bayesian inverse reinforcement learning. In: Proceedings—IEEE international conference on robotics and automation, pp 2889–2895. https://doi.org/10.1109/ICRA.2016.7487452
Hebesberger D, Koertner T, Gisinger C, Pripfl J, Dondrup C (2016). Lessons learned from the deployment of a long-term autonomous robot as companion in physical therapy for older adults with dementia: a mixed methods study. In: ACM/IEEE international conference on human–robot interaction, pp 27–34. https://doi.org/10.1109/HRI.2016.7451730
Kümmerle R, Ruhnke M, Steder B, Stachniss C, Burgard W (2014) Autonomous robot navigation in highly populated pedestrian zones. J Field Robot 33(1):1–17. https://doi.org/10.1002/rob
Haddadin S, De Luca A, Albu-Schäffer A (2017) Robot collisions: a survey on detection, isolation, and identification. IEEE Trans Rob 33(6):1292–1312. https://doi.org/10.1109/TRO.2017.2723903
Challenger R, Clegg CW, Robinson MA (2009) Understanding crowd behaviours: simulation tools. The Cabinet Office Emergency Planning College. 2009. Available online: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/62640/simulationtools1_0.pdf
Boukas E, Kostavelis I, Gasteratos A, Sirakoulis GC (2015) Robot guided crowd evacuation. IEEE Trans Autom Sci Eng 12(2):739
Trautman P, Krause A (2010) Unfreezing the robot: navigation in dense, interacting crowds. In: 2010 IEEE/RSJ international conference on intelligent robots and systems, Taipei, pp 797–803. https://doi.org/10.1109/IROS.2010.5654369
Lasota PA, Fong T, Shah JA (2014) A survey of methods for safe human–robot interaction. Found Trends Rob 5(4):261–349. https://doi.org/10.1561/2300000052
ISO 13482:2014(E) Robots and robotic devices—Safety requirements for personal care robots
Hirano T, Shiomi M, Iio T et al (2018) How do communication cues change impressions of human–robot touch interaction? Int J Soc Rob 10:21. https://doi.org/10.1007/s12369-017-0425-8
Block AE, Kuchenbecker KJ (2018) Emotionally supporting humans through robot hugs. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction (HRI '18). ACM, New York, NY, USA, pp 293–294. https://doi.org/10.1145/3173386.3176905
Jeong S, Breazeal C, Logan D, Weinstock P (2018) Huggable: the impact of embodiment on promoting socio-emotional interactions for young pediatric inpatients. In: Proceedings of the 2018 CHI conference on human factors in computing systems (CHI '18). ACM, New York, NY, USA, Paper 495. https://doi.org/10.1145/3173574.3174069
Di P et al (2016) Fall detection and prevention control using walking-aid cane robot. IEEE/ASME Trans Mechatron 21(2):625–637. https://doi.org/10.1109/TMECH.2015.2477996
Cifuentes CA, Rodriguez C, Frizera-Neto A, Bastos-Filho TF, Carelli R (2016) Multimodal human–robot interaction for walker-assisted gait. IEEE Syst J 10(3):933–943
Suzuki K (2016) QOLO technology changes life for wheelchair users [industrial activities]. IEEE Robot Autom Mag 23(1):12–12. https://doi.org/10.1109/MRA.2015.2511684
Leaman J, La HM (2017) A comprehensive review of smart wheelchairs: past, present, and future. IEEE Trans Hum Mach Syst 47(4):486–499. https://doi.org/10.1109/THMS.2017.2706727
Rosenstrauch MJ, Kruger J (2017) Safe human–robot–collaboration–introduction and experiment using ISO/TS 15066. In: 2017 3rd international conference on control, automation and robotics, ICCAR 2017, pp 740–744. https://doi.org/10.1109/ICCAR.2017.7942795
Kim KS, Kwok AS, Sentis L (2013) Contact sensing and mobility in rough and cluttered environments. In: 2013 European conference on mobile robots, ECMR 2013—conference proceedings, pp 274–281. https://doi.org/10.1109/ECMR.2013.6698854
Kim KS, Llado T, Sentis L (2016) Full-body collision detection and reaction with omnidirectional mobile platforms: a step towards safe human–robot interaction. Auton Robot 40(2):325–341. https://doi.org/10.1007/s10514-015-9464-x
Huber L, Billard A, Slotine J-J (2019) Avoidance of convex and concave obstacles with convergence ensured through contraction. IEEE Robot Autom Lett 4(2):1462–1469. https://doi.org/10.1109/lra.2019.2893676
Guzzi J, Giusti A, Gambardella LM, Theraulaz G, Di Caro GA (2013) Human-friendly robot navigation in dynamic environments. In: 2013 IEEE international conference on robotics and automation, Karlsruhe, pp 423–430. https://doi.org/10.1109/ICRA.2013.6630610
Savkin AV, Wang C (2014) Seeking a path through the crowd: robot navigation in unknown dynamic environments with moving obstacles based on an integrated environment representation. Robot Auton Syst 62:1568–1580
Palacin J et al (2004) Building a mobile robot for a floor-cleaning operation in domestic environments. IEEE Trans Instrum Meas 53(5):1418–1424
Che Y, Okamura AM, Sadigh D (2018) Efficient and trustworthy social navigation via explicit and implicit robot-human communication. CoRR abs/1810.11556
Trautman P, Krause A (2010) Unfreezing the robot: navigation in dense, interacting crowds. In: 2010 IEEE/RSJ international conference on intelligent robots and systems (IROS), IEEE
Shan Y, Koren Y (1995) Obstacle accommodation motion planning. IEEE Trans Rob Autom 11(1):36–49. https://doi.org/10.1109/70.345936
Althoff D, Althoff M, Wollherr D, Buss M (2010) Probabilistic collision state checker for crowded environments. In: 2010 IEEE international conference on robotics and automation anchorage convention district. May 3–8, 2010, Anchorage, Alaska, USA
Trautman P, Ma J, Murray RM, Krause A (2013) Robot navigation in dense human crowds: the case for cooperation. In: 2013 IEEE international conference on robotics and automation, Karlsruhe, pp 2153–2160. https://doi.org/10.1109/ICRA.2013.6630866
Shrestha MC et al (2015) Using contact-based inducement for efficient navigation in a congested environment. In: 2015 24th IEEE international symposium on robot and human interactive communication (RO-MAN), Kobe, pp 456–461. https://doi.org/10.1109/ROMAN.2015.7333673
Kamezaki M, Shrestha M, Tsuburaya Y, Kono R, Sugano S (2018) Utilizing robot’s forearm contact for handling space constraints in congested environment. In: IROS 2018 workshop from freezing to jostling robots: current challenges and new paradigms for safe robot navigation in dense crowds. Madrid October 1, 2018
Billard A (2017) On the mechanical, cognitive and sociable facets of human compliance and their robotic counterparts. Rob Auton Syst 88:157–164
Chen TL, King C-HA, Thomaz AL, Kemp CC (2014) An investigation of responses to robot-initiated touch in a nursing context. Int J Soc Robot 6(1):141–161
ISO 12100:2010. Safety of machinery—General principles for design—risk assessment and risk reduction
Kouris A, Dimeas F, Aspragathos N (2018) A frequency domain approach for contact type distinction in human–robot collaboration. IEEE Rob Autom Lett 3(2):720–727. https://doi.org/10.1109/LRA.2017.2789249
Bruno B, Recchiuto CT, Papadopoulos I et al (2019) Knowledge representation for culturally competent personal robots: requirements, design principles, implementation, and assessment. Int J Soc Rob 11:515–538. https://doi.org/10.1007/s12369-019-00519-w
Argall BD, Billard AG (2010) A survey of tactile human–robot interactions. Robot Auton Syst 58(10):1159–1176. https://doi.org/10.1016/j.robot.2010.07.002
Wortham RH, Theodorou A, Bryson JJ (2016) Robot transparency, trust and utility. Paper presented at AISB Workshop on Principles of Robotics, Sheffield, UK
Salvini P et al (2010) How safe are service robots in urban environments? Bullying a robot. In: 19th international symposium in robot and human interactive communication, Viareggio, pp 1–7. https://doi.org/10.1109/ROMAN.2010.5654677
Alzola Kirschgens L, Zamalloa Ugarte I, Gil Uriarte E, Muniz Rosas A, Mayoral Vilches V (2018) Robot hazards: from safety to security. ArXiv e-prints
Guadarrama-Olvera JR, Dean E, Cheng G (2017) Using intentional contact to achieve tasks in tight environments. In: 2017 IEEE international conference on robotics and automation (ICRA), Singapore, pp 1000–1005
Ferland F, Aumont A, Létourneau D, Michaud F (2013) Taking your robot for a walk: force-guiding a mobile robot using compliant arms. In: 2013 8th ACM/IEEE international conference on human–robot interaction (HRI), Tokyo, pp 309–316. https://doi.org/10.1109/HRI.2013.6483604
Keijsers M, Bartneck C (2018) Mindless robots get bullied. In: Proceedings of the 2018 ACM/IEEE international conference on human–robot interaction (HRI '18). ACM, New York, NY, USA, pp 205–214. https://doi.org/10.1145/3171221.3171266
Cramer H, Kemper N, Amin A, Evers V (2009) Touched by robots: effects of physical contact and proactiveness. REPORT INS-E0903
Horton J, Cameron A, Devaraj D, Hanson RT, Hajkowicz SA (2018) Workplace safety futures: the impact of emerging technologies and platforms on work health and safety and workers’ compensation over the next 20 years. CSIRO, Canberra. Available online https://www.inxsoftware.com/media/transfer/doc/workplacesafety.pdf. Accessed Nov 3, 2018
Hudson CR, Bethel CL (2018) Stress factors that impact robot operator control in high-stress dynamic scenarios. In: Companion of the 2018 ACM/IEEE international conference on human–robot interaction (HRI 2018). ACM, New York, NY, USA, pp 297–298. https://doi.org/10.1145/3173386.3176917
BS 8611:2016 Robots and robotic devices. Guide to the ethical design and application of robots and robotic systems. The British Standards Institution
‘Report with recommendations to the Commission on Civil Law Rules on Robotics’ published on 27.1.2017. Available online: http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+REPORT+A8-2017-0005+0+DOC+XML+V0//EN. Accessed Dec 19, 2018
Turkle S (2011) Alone together why we expect more from technology and less from each other. Basic Books, New York
Sparrow R (2002) The march of the robot dogs. Ethics Inf Technol 4:305. https://doi.org/10.1023/A:1021386708994
Borenstein J, Arkin RC (2016) Robots, ethics, and intimacy: the need for scientific research. In: Conference of the international association for computing and philosophy (IACAP 2016). Ferrara, IT
Turkle S (2006) A nascent robotics culture: new complexities for companionship. AAAI Technology Report Series. Available online http://web.mit.edu/sturkle/www/pdfsforstwebpage/ST_Nascent%20Robotics%20Culture.pdf. Accessed Sept 5, 2012
Fosch-Villaronga E, Albo-Canals J (2019) I’ll take care of you, said the robot. Paladyn J Behav Robot 10(1):77–93. https://doi.org/10.1515/pjbr-2019-0006
Sparrow R, Sparrow L (2006) In the hands of machines? The future of aged care. Mind Mach 16:141. https://doi.org/10.1007/s11023-006-9030-6
Sharkey A, Sharkey N (2012) Granny and the robots: ethical issues in robot care for the elderly. Ethics Inf Technol 14(1):27–40
Sheehan D, Furfaro K, Brown R (2018) Laguardia airport security robot is giving women the creeps. Available Online https://nypost.com/2018/05/03/laguardia-airports-security-robot-is-giving-women-the-creeps/. Accessed Nov 3, 2018
Caine K, Šabanovic S, Carter M (2012) The effect of monitoring by cameras and robots on the privacy enhancing behaviors of older adults. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction (HRI '12). ACM, New York, NY, USA, pp 343–350. https://doi.org/10.1145/2157689.2157807
Sauppé A, Mutlu B (2015) The social impact of a robot co-worker in industrial settings. In: Proceedings of the 33rd annual ACM conference on human factors in computing systems (CHI '15). ACM, New York, NY, USA, pp 3613–3622. https://doi.org/10.1145/2702123.2702181
Lee J-G, Kim KJ, Lee S, Shin D-H (2015) Can autonomous vehicles be safe and trustworthy? Effects of appearance and autonomy of unmanned driving systems. International
Mori M, MacDorman KF, Kageki N (2012) The uncanny valley [from the field]. IEEE Robot Autom Mag 19(2):98–100. https://doi.org/10.1109/MRA.2012.2192811
Nie J, Park M, Marin AL, Sundar SS (2012) Can you hold my hand? Physical warmth in human–robot interaction. In: Seventh annual ACM/IEEE international conference on human–robot interaction, ACM, pp 201–202
Norman DA (1988) The psychology of everyday things. Basic Books, London
Jamone L, Ugur E, Cangelosi A et al (2016) Affordances in psychology, neuroscience and robotics: a survey. IEEE Trans Cognit Develop Syst. https://doi.org/10.1109/TCDS.2016.2594134
Dragan AD, Bauman S, Forlizzi J, Srinivasa SS (2015) Effects of robot motion on human–robot collaboration. In: Proceedings of the tenth annual ACM/IEEE international conference on human–robot interaction (HRI '15). ACM, New York, NY, USA, pp 51–58. https://doi.org/10.1145/2696454.2696473
Lauckner M, Kobiela F, Manzey D (2014) ‘Hey Robot, please step back!’—Exploration of a spatial threshold of comfort for human-mechanoid spatial interaction in a hallway scenario. In: 2014 23rd IEEE international symposium on robot and human interactive communication (RO-MAN), pp 780–787
Sardar A, Joosse M, Weiss A et al (2012) Don't stand so close to me: users' attitudinal and behavioral responses to personal space invasion by robots. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction (HRI '12), pp 229–230
Iio T, Shiomi M, Kamei K, Sharma C, Hagita N (2016) Social acceptance by senior citizens and caregivers of a fall detection system using range sensors in a nursing home. Adv Robot 30(3):190–205. https://doi.org/10.1080/01691864.2015.1120241
Caine K, Šabanovic S, Carter M (2012) The effect of monitoring by cameras and robots on the privacy enhancing behaviors of older adults. In: Proceedings of the ACM/IEEE international conference on human–robot interaction, Boston, MA, USA, 5–8 March, pp 343–350
Złotowski JA, Sumioka H, Nishio S, Glas DF, Bartneck C, Ishiguro H (2018) Persistence of the uncanny valley. In: Ishiguro H, Dalla Libera F (eds) Geminoid studies. Springer, Singapore
Bartneck C, Kulić D, Croft E et al (2009) Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int J Soc Robot 1:71. https://doi.org/10.1007/s12369-008-0001-3
P7001 Working Group: Transparency of autonomous systems. Available online: http://sites.ieee.org/sagroups-7001/
Saerbeck M, van Breemen AJN (2007) Design guidelines and tools for creating believable motion for personal robots. In: RO-MAN 2007—the 16th IEEE international symposium on robot and human interactive communication, Jeju, pp 386–391. https://doi.org/10.1109/ROMAN.2007.4415114
MacDorman KF, Ishiguro H (2006) The uncanny advantage of using androids in social and cognitive science research. Interact Stud 7(3):297–337. https://doi.org/10.1075/is.7.3.03mac
Saygin AP (2011) The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Soc Cogn Affect Neurosci 7(4):413–422. https://doi.org/10.1093/scan/nsr025
Salvini P, Laschi C, Dario P (2010) Design for acceptability: improving robots’ coexistence in human society. Int J Soc Robot 2:451. https://doi.org/10.1007/s12369-010-0079-2
Sisbot EA, Marin-Urias LF, Alami R, Siméon T (2007) A human aware mobile robot motion planner. IEEE Trans Rob 23:874–883
Pacchierotti E, Christensen H, Jensfelt P (2005) Embodied social interaction for service robots in hallway environments. In: Field and service robotics, pp 476–487
Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42(3–4):143–166
Dautenhahn K (2007) Socially intelligent robots: dimensions of human–robot interaction. Philos Trans R Soc B Biol Sci 362(1480):679–704
Takayama L, Pantofaru C (2009) Influences on proxemic behaviors in human–robot interaction. In: IEEE/RSJ international conference on intelligent robots and systems (IROS 2009), IEEE, pp 5495–5502
Złotowski JA, Weiss A, Tscheligi M (2012) Navigating in public space: participants' evaluation of a robot's approach behavior. In: Proceedings of the seventh annual ACM/IEEE international conference on human–robot interaction (HRI '12). ACM, New York, NY, USA, pp 283–284. https://doi.org/10.1145/2157689.2157795
Hanajima N et al (2005) Investigation of impressions for approach motion of a mobile robot based on psychophysiological analysis. In: ROMAN 2005—IEEE international workshop on robot and human interactive communication, Nashville, TN, USA, 13–15 Aug 2005. https://doi.org/10.1109/ROMAN.2005.1513760
Pham TQ, Nakagawa C, Shintani A, Ito T (2015) Evaluation of the effects of a personal mobility vehicle on multiple pedestrians using personal space. IEEE Trans Intell Transp Syst 16(4):2028–2037. https://doi.org/10.1109/TITS.2014.2388219
Bartolozzi C, Natale L, Nori F, Metta G (2016) Robots with a sense of touch. Nat Mater 15:921–925. https://doi.org/10.1038/nmat4731
Chen TL, King C, Thomaz AL, Kemp CC (2011) Touched by a robot: an investigation of subjective responses to robot-initiated touch. In: 2011 6th ACM/IEEE international conference on human–robot interaction (HRI), Lausanne, pp 457–464
Spek J (2014) Touched by a robot. Master’s thesis, Utrecht University
Li JJ, Ju W, Reeves B (2017) Touching a mechanical body: tactile contact with body parts of a humanoid robot is physiologically arousing. J Hum-Rob Interact 6(3):118–130. https://doi.org/10.5898/JHRI.6.3.Li
Shrestha MC et al (2015) An investigation into the social acceptance of using contact for inducing an obstructing human. In: 2015 IEEE-RAS 15th international conference on humanoid robots (Humanoids), Seoul, South Korea
Willemse CJAM, Toet A, van Erp JBF (2017) Affective and behavioral responses to robot-initiated social touch: toward understanding the opportunities and limitations of physical contact in human–robot interaction. Front ICT 4:12. https://doi.org/10.3389/fict.2017.00012
Arnold T, Scheutz M (2018) Observing robot touch in context: how does touch and attitude affect perceptions of a robot’s social qualities? In: HRI ’18, March 5–8, 2018, Chicago, IL, USA
Hoffmann L (2018) That robot touch that means so much: on the psychological effects of human–robot touch. Ph.D. thesis, Faculty of Engineering, Department of Computer Science and Applied Cognitive Science, University of Duisburg-Essen. Available online https://duepublico.uni-duisburg-essen.de/servlets/DerivateServlet/Derivate-43096/Hoffmann_Laura_Diss.pdf. Accessed Nov 3, 2018
McGinn C, Torre I (2019) Can you tell the robot by the voice? An exploratory study on the role of voice in the perception of robots. In: 2019 14th ACM/IEEE international conference on human–robot interaction (HRI), Daegu, Korea (South), pp 211–221. https://doi.org/10.1109/HRI.2019.8673305
Trovato G et al (2018) The sound or silence: investigating the influence of robot noise on proxemics. In: 2018 27th IEEE international symposium on robot and human interactive communication (RO-MAN), Nanjing, pp 713–718. https://doi.org/10.1109/ROMAN.2018.8525795
Rios-Martinez J, Spalanzani A, Laugier C (2015) From proxemics theory to socially-aware navigation: a survey. Int J Soc Rob 7:137–153. https://doi.org/10.1007/s12369-014-0251-1
Che Y, Sadigh D, Okamura AM (2019) Efficient and trustworthy social navigation via explicit and implicit robot–human communication. https://arxiv.org/abs/1810.11556
Johnson C, Kuipers B (2018) Socially-aware navigation using topological maps and social norm learning. In: Proceedings of the 2018 AAAI/ACM conference on ai, ethics, and society (AIES '18). ACM, New York, NY, USA, pp 151–157. https://doi.org/10.1145/3278721.3278772
We warmly thank Peter Kahn for his insightful comments on an earlier version of this manuscript. The authors thank Laura Cohen and David Gonon for the illustrations.
Open Access funding provided by EPFL Lausanne. This study was supported by the CROWDBOT project (http://crowdbot.eu/), funded by the European Commission Horizon 2020 ICT programme (Grant Agreement No. 779942).
Conflict of interest
The authors declare that they have no conflict of interest.
Salvini, P., Paez-Granados, D. & Billard, A. Safety Concerns Emerging from Robots Navigating in Crowded Pedestrian Areas. Int J of Soc Robotics 14, 441–462 (2022). https://doi.org/10.1007/s12369-021-00796-4