
From Theory to Real-World Integration: Implementation Science and Beyond




1 Introduction

1.1 Characteristics of Healthcare and Its Complexity

The increasing complexity and dynamicity of our society (and world of work) have meant that healthcare systems have changed and continue to change, and consequently continue to assume different characteristics. The causes of mortality are an excellent example of this rapid transformation: non-communicable diseases have become the leading cause of death, according to World Health Organization (WHO) data, but at the same time new problems are emerging, such as infectious diseases, like Ebola or some forms of influenza, which occur unexpectedly or without advance warning. Many of these new diseases diffuse rapidly through the different parts of the globe due to the increasingly interconnected nature of the world. Another example of the healthcare transformation is the innovation associated with the introduction and development of advanced communication and technology systems (such as minimally invasive surgery and robotics, transplantation, and automated antiblastic preparation) at all levels of care. Consequently, the social and technical dimensions of healthcare are becoming more and more complex and pose a significant challenge for all the stakeholders in the system to make sense of and ensure high-quality healthcare. These stakeholders include but are not limited to patients and their families, caregivers, clinicians, managers, policymakers, regulators, and politicians. It is an inescapable truth that humans are always going to be part of healthcare systems, and it is these humans who, by their very nature, introduce variability and complexity to the system (we do not necessarily view this as a negative, as this chapter will illustrate). At the microlevel, a central relationship in focus is that between the clinician and the patient, two human beings, making the health system a very peculiar organization compared to similarly high-risk organizations such as aviation or nuclear energy.
This double human being system [1] requires significant effort (good design) in managing unpredictability through the development of personal and organizational skills, such as the ability to react positively and rapidly to unexpected events and to adopt a resilient strategy for survival and advancement. In contrast to other industries with similar levels of risk and system safety, healthcare settings are still plagued by numerous errors and negative events involving humans (and other elements) at various levels within the system. Emotional involvement is very high due to daily exposure to social relationships, and it results in significant challenges to addressing both technical and non-technical issues simultaneously.

The context becomes a key element for understanding how to find a balance in this continuous struggle to manage the social and technical aspects of the healthcare system, and to reconcile the standardization of evidence-based clinical processes with the personalization of care related to the diversity of patients. The analysis of situational characteristics is vital to understanding how to apply solutions that consider the peculiar dynamicity of healthcare settings. It is also important to underline that, among the generally acknowledged diversity, there are some settings which share similar patients and common practices, yet have different risks and different ways of looking at safety [2]. The implications are that each context in which care is provided presents its own unique challenges, practices, risks, and approaches to promote safety. Thus, risk identification and analysis, and quality and safety strategies, should also differ according to the contextual nuances. For example, a trauma center cannot have the same strategy to improve safety as a blood transfusion service: the trauma center is based on managing the unexpected due to emergency situations, while the blood transfusion process is a more planned, standardized process. In the trauma center, to stay safe you have to adapt and develop team-based skills; in a blood service you need to make sure the blood is not contaminated and is administered to the right person, and this is work that can easily be standardized. This complexity and diversity of healthcare is the main characteristic to keep in mind when trying to understand healthcare systems, and it should be factored into any design of the system and any research or intervention project, in order to define effective actions for improvement. Therefore, the purpose of this chapter is, firstly, to highlight some of the key issues in healthcare relating to adverse events and medical errors.
Secondly, to discuss the approaches adopted to ensure quality and safety in healthcare, including some of the new approaches being advocated in the human factors and ergonomics community. Lastly, we will provide some suggestions to open a discussion on the way forward through the integration of various approaches into a coherent transdisciplinary view of healthcare.

1.2 Epidemiology of Adverse Events and Medical Errors

According to the latest Consensus Study Report released by the National Academies of Sciences, Engineering, and Medicine, “Crossing the Global Quality Chasm: Improving Health Care Worldwide,” healthcare in all global settings today suffers from high levels of deficiencies in quality across many domains, causing ongoing harm to human health [3]. According to WHO global estimates, at least five patients die every minute because of unsafe care. In High-Income Countries (HICs), the incidence of adverse events is approximately 9%, of which around 60% could be prevented [4]. A recent Organization for Economic Co-operation and Development (OECD) analysis found that 15% of all hospital costs in OECD nations are due to patient harm from adverse events [5].

In countries with limited resources, every year there are 134 million adverse events related to unsafe care, causing more than 2.6 million deaths annually. Many of these adverse events are largely preventable as they result from unsafe treatment systems, and not patient pathology. In a study on frequency and preventability of adverse events, across 26 low- and middle-income countries, the rate of adverse events was around 8%, of which 83% could have been prevented and most alarmingly 30% led to death [6].
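To make the scale of these rates concrete, the figures from the 26-country study above can be turned into a back-of-envelope estimate. The cohort size below is a made-up example; only the percentages come from the text.

```python
# Illustrative calculation using the rates cited above (~8% adverse-event
# rate, ~83% preventable, ~30% fatal among adverse events). The number of
# admissions is a hypothetical assumption, not a figure from the study.

admissions = 10_000                # hypothetical hospital admissions
adverse_event_rate = 0.08          # ~8% of admissions experience an adverse event
preventable_share = 0.83           # ~83% of those events were preventable
fatal_share = 0.30                 # ~30% of those events led to death

adverse_events = admissions * adverse_event_rate
preventable = adverse_events * preventable_share
deaths = adverse_events * fatal_share

print(f"Adverse events: {adverse_events:.0f}")   # 800
print(f"Preventable:    {preventable:.0f}")      # 664
print(f"Deaths:         {deaths:.0f}")           # 240
```

Even at this modest hospital scale, the cited rates imply hundreds of preventable events per year, which is why the sections below focus on systemic rather than individual causes.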

In low- and middle-income countries, a combination of unfavorable factors is commonplace: understaffing, inadequate structures and overcrowding, lack of healthcare infrastructure and resources, a shortage of basic equipment, and poor hygiene and sanitation, all of which can be attributed to limited financial resources and all of which contribute to unsafe patient care. A weak safety and quality culture, flawed processes of care, and disinterested leadership teams further weaken the ability of healthcare systems and organizations to ensure the provision of safe and effective healthcare [7].

Errors can be classified according to their outcome, the setting where they take place (e.g., inpatient versus outpatient), the kind of procedure involved (medication, surgery, etc.), or the probability of occurrence (high versus low). Error categories are analyzed by taking into consideration their prevalence, avoidance, and associated factors, as well as the different strategies for detecting medical errors [8]. Among the problems that commonly occur in healthcare provision are adverse drug events; improper transfusions; misdiagnoses; under- and over-treatment; unsafe injection practices; surgical injuries and wrong-site surgery; radiation errors involving overexposure to radiation and cases of wrong-patient and wrong-site identification; sepsis; venous thromboembolism; unsafe care in mental health settings, including use of restraint, suicide, absconding, and reduced capacity for self-advocacy; falls; pressure ulcers; and mistaken patient identities. High error rates with serious consequences are most likely to occur in intensive care units, operating rooms, and emergency departments. Medical errors are also associated with extremes of age, new procedures, urgency, and the severity of the medical condition being treated [9,10,11,12]. Medical errors occur across the whole spectrum of the care process, from prescription to administration, and can be attributed to both the social and technical components of the system. In spite of the high prevalence of medical errors and the very evident harm to patients, in many contexts fear around the reporting of these errors is commonplace, which in turn impedes progress and learning for improvement and error prevention [13].

1.2.1 Barriers to Safe Practice in Healthcare Settings

The experience of countries that are heavily engaged in national efforts to reduce error and increase the safe provision of healthcare services clearly demonstrates that, although health systems differ from country to country, many threats to patient safety have similar causes and often similar solutions. Zecevic (2017) and Farokhzadian (2018) identified the following barriers to safe care provision: heavy workloads, lack of time, lack of resources, poor communication, inadequate organizational infrastructure, insufficient leadership effectiveness, inadequate efforts to keep pace with national and international standards, and overshadowed values of team participation [14, 15]. Leape and Berwick (2005) argue that the barriers to the reduction of errors in the context of healthcare remain rooted in the nature and the culture of medicine. Regarding the context of healthcare, the sheer complexity of the system, given the many different specialties and parts of the system that are involved in the care process, increases the likelihood of poor interactions and the risk of failure [16]. Linked to this, with respect to the culture of medicine, continued professional fragmentation and a lack of teamwork, characterized by different medical specialists or parts of the care process continuing to work in silos, further contribute to the risk of errors in the healthcare system, as found by Hignett et al. (2018) in their study of barriers to the provision of effective healthcare in England. This status quo is perpetuated by a very strong hierarchical, authoritarian structure and the perceived threat that enhanced collaboration and communication may undermine or threaten professional independence and autonomy [16]. Poor or disturbed communication (due to fragmented work structures and poor design of the physical environment, respectively) also presents additional barriers to effective and safe practice [17].

Aligned to this is the continued culture of fear around the reporting of mistakes or errors, given the person-centered blame culture that Leape and Berwick (2005) and, more recently, Holden (2009) maintain is still very much a part of most industries, including aviation and healthcare. In response to this, there is still a need for the development of effective and appropriate reporting and learning systems [18, 19], which, if introduced alongside a just culture, may play an important role in identifying systemic weaknesses, which Woods and Cook (2002) argue is a more effective method of recovering from errors than identifying problematic or “flawed humans” (p. 140). However, in their small study, Mitchell et al. (2016) report that poor reporting processes, a lack of engagement on the part of medical staff to report, poor or no feedback and inaction on reported events, a lack of institution-level support and funding, and inadequate integration and leveraging of ever-changing health information technology remain barriers to effective reporting and learning system development and integration.

1.3 Error and Barriers to Safety: The Human or the System?

In 1999, the Institute of Medicine (IOM) released a landmark report, To Err is Human, which many authors argue was a turning point for patient safety in the United States and more globally [20]. Amongst many important recommendations, significant points outlined in the report included the fact that errors, although common and costly, can be prevented to improve patient safety, provided that the systems-related contributory factors to these errors become the focus of addressing safety issues in healthcare (IOM 1999). While many commentators argue that there is an increased appreciation of the systemic nature of errors in the healthcare setting [16, 18,19,20], some still assert that, unfortunately, there is a very prevalent person-centered blame culture in high-reliability organizations such as aviation and healthcare, which to some extent is a “psychological tendency and an industry norm” [21]. This way of thinking and error assignment is referred to by Reason (2000) and Dekker (2002) as the person approach, which holds that errors occur because of unwanted human variability and fallibility in otherwise safe systems. This view of error stresses that people working at the sharp end perform unsafe acts, characterized by various errors and violations that arise from abnormal cognitive processes such as forgetfulness and inattention, which can only be rectified by reducing human variability, setting better boundaries through training and discipline, and possibly even naming and shaming [22, 23].

In contrast, as highlighted by the IOM and other authors [22, 23], errors can be better understood by taking a systems approach or view. This holds that safety is an emergent property of the way in which a system is designed and not a product of the action of its individual components [21, 24]. From this perspective, errors which occur at the sharp end are the result of a host of latent systemic conditions or design flaws, or what Reason refers to as “resident pathogens” (2000; p. 769), and active failures of people while performing their work. Therefore, it is not necessarily the human who causes the error (no matter the context) but rather the human’s interactions with the broader system (the tools, tasks, environment, and other people in a certain organizational framework and context) which, if the system has latent failures, result in the occurrence of error. Woods and Cook (2002) stress that in order to recover from error there is a need to search for systemic vulnerabilities, while understanding work as it is performed at the sharp end. This enables the detection of latent failures within the design of the system by those who operate within it, a critical step in informing decision-makers on what needs to be prioritized to improve safety and reduce the likelihood of the same thing happening again.

Effectively, it is critical to understand whether there is compatibility between the social side of work (humans, their beliefs and cultures) and the technical side of work (how it is designed, organized, and actually executed). This requires an appreciation of sociotechnical systems theory, which is expanded below. Additionally, as articulated in the seminal paper by Rasmussen (1997), to effectively manage risk associated with work, no matter the context, there is a need to consider the various levels of stakeholders involved in the control, regulation, and execution of work. This is captured in Rasmussen’s Hierarchical Risk Management Framework, which stresses the importance of the vertical integration of knowledge and decisions across all stakeholders (which, in this model, include government, regulators, company executives, and management and staff at the sharp end) [25]. In other words, knowledge of how work is done and its associated challenges at the sharp end should be communicated up the hierarchy to inform decisions made higher up. Equally, decisions at higher levels should also influence the decisions and actions at lower levels [25, 26]. This repeated assertion of the need for vertical integration between different levels of stakeholders within systems supports calls from other authors [16, 18,19,20] who all argue for more national and institutional support for programs aimed at enhancing patient safety, combined with a continued need for multidisciplinary scientific research and management teams. This research, as asserted by Bindman et al. (2018) and Bates and Singh (2018), should be embedded within the context of specific healthcare systems and contribute to a better understanding of problems within specific systems, solutions for which can be developed through learning laboratories and pilot interventions in situ.
In order to become more responsive to the calls to understand error from a systemic perspective in the context of healthcare (rather than just as the fault of the human), while fostering better cross-field and cross-hierarchy collaboration amongst relevant stakeholders, the application of different methods, such as implementation science, ethnography, and Human Factors and Ergonomics, may provide a more holistic overview of the challenges within different contexts. This knowledge can then be leveraged to develop context-specific and culturally sensitive interventions. The following sections therefore highlight these important approaches for ensuring quality and safety in healthcare systems.

2 Approaches to Ensuring Quality and Safety

2.1 The Role of Implementation Science and Ethnography in the Implementation of Patient Safety Initiatives

Treating and caring for people in a safe environment and protecting them from healthcare-related avoidable harm should be national and international priorities, calling for concerted international efforts [13]. Achieving a culture of safety requires an understanding of the values, attitudes, beliefs, and norms that are important to healthcare organizations, and of what attitudes and behaviors are appropriate and expected for patient safety [27]. Differences between contexts (e.g., policies, culture, and healthcare organization characteristics) may explain variations in the effects of patient safety solutions implementation. Problematically, knowledge of which contextual features are important determinants of patient safety solutions is limited. The lack of understanding could in part be due to the complex nature of unpacking context. As Øvretveit and colleagues have reported (2011), few studies have assessed the effect of context on the implementation of safety and quality interventions. In the field of patient safety research, there is little evidence or consensus around which contexts are the most salient for patient safety practice implementation and which contextual factors impact improvement interventions [28]. At the same time, it is hard to identify a unique model for designing and implementing safety interventions that can build a sufficient understanding of highly complex systems such as healthcare. Implementation science is one of the most recognized frameworks for transferring evidence-based solutions from research theory to everyday practice at the frontline. Implementation research is indeed defined in the literature as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services. It includes the study of influences on healthcare professional and organizational behavior” [29].

The aim of implementation research is broader than that of traditional clinical research, as it proposes a systemic analysis not limited solely to assessing the effect of the introduction of a new variable, but rather seeks to verify how this variable impacts operators, the organization, the physical environment, and up to the highest level of health policies [30]. Implementation-research studies and ethnographic methods of investigation, applied to research in patient safety and clinical risk management, have stressed the importance of the organizational and cultural characteristics of the context in the implementation process of an intervention. At the core of implementation research lies the idea that every improvement solution has to be oriented toward bringing about organizational and behavioral improvement, triggering virtuous processes toward safety that over time become part of the heritage of the system [31]. Therefore, interventions to improve patient safety would be most effective when developed by those with local “expertise” and local knowledge, while taking into account evidence-based solutions from other contexts [32]. Local expertise and knowledge are indeed critical resources for understanding what is culturally appropriate, the different priorities and capacities to answer the needs of the populations (resources and infrastructures), and the characteristics and relationships of different health system stakeholders.

According to this approach, the analysis tends to be more holistic, system oriented and amenable to adaptation rather than simply assessing the impact of change factors on the individual components of the system [33]. Here the complexity is not explained in terms of the sum of the individual parts, but in terms of the relationships between the software (non-physical resources such as organizational policies and procedures), hardware (physical resources as workplace, equipment, tools), environment (such as climate, temperature, socioeconomic factors), and liveware (human-related elements as teamwork, leadership, communication, stress, culture), the so-called SHELL model [34].
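The relational view of the SHELL model can be illustrated with a small sketch that groups contributing factors by category. The categories come from the model as described above; the example factors and the `classify` helper are illustrative assumptions, not drawn from any real incident analysis.

```python
# A minimal sketch of the SHELL model as a classification of contributing
# factors. Example factors below are hypothetical, for illustration only.

SHELL = {
    "software":    ["handover protocol unclear", "checklist not localized"],
    "hardware":    ["infusion pump interface confusing", "monitor alarm inaudible"],
    "environment": ["overcrowded ward", "high ambient noise"],
    "liveware":    ["steep team hierarchy", "fatigue at end of shift"],
}

def classify(factor: str, category: str) -> None:
    """Attach a newly observed contributing factor to a SHELL category."""
    if category not in SHELL:
        raise ValueError(f"unknown SHELL category: {category}")
    SHELL[category].append(factor)

# A procedural gap is a "software" (non-physical resource) factor in SHELL terms.
classify("no double-check for blood products", "software")
print(sum(len(v) for v in SHELL.values()))  # 9 factors tracked
```

The point of the model, as the text notes, is that the analysis looks at the interactions between these categories rather than at any single list in isolation.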

Implementation science provides research designs that combine methods of quantitative analysis and qualitative investigation. Both qualitative and quantitative methods are essential during the development phase of the intervention and during the evaluation. They combine epidemiological data with an ethnographic analysis [35]. The relevance of ethnographic studies has been highlighted in patient safety since the publication of several reports during the 1970s in the United States [36]. These qualitative studies enable the analysis of the traditional structures and cultural aspects by using methods such as interviews (semi-structured, structured), observation (direct or video), and focus groups [37]. The added value of the ethnographic method lies in its ability to analyze what actually happens in the care settings, to understand how the work is actually done rather than the work as imagined and prescribed [38]. This helps to identify factors and variables that can influence the process at different stakeholder levels, namely patient, caregiver, department, structure, organization, community, and political decision-makers [30].

Several models for translating the implementation science approach into practice have been defined by international agencies and organizations working in the field of safety and quality of care. Some focus on how to build bidirectional collaboration for improvement between stakeholders in different geographical areas, in particular between HICs and LMICs, with one such example being the World Health Organization (WHO) Twinning Partnership for Improvement (TPI) model [39]. Other approaches focus more on the process to be followed in order to propose safety solutions that are suitable for the specific context, responsive to multidisciplinarity, scalable, sustainable, and adaptable to changes in context and user needs, for example, the Institute for Healthcare Improvement (IHI)’s Collaborative Breakthrough model [40], while the International Ergonomics Association (IEA) General Framework Model [41] is oriented to understanding the interactions among humans and other elements of a system in order to optimize human well-being and overall system performance. The following sections provide a brief outline of each of these approaches.

2.1.1 WHO Twinning Partnership for Improvement (TPI) Model

The hospital-to-hospital model developed in the WHO African Partnership for Patient Safety (APPS) program provides the foundation on which the “Twinning Partnership for Improvement” was developed. APPS aimed to build sustainable patient safety partnerships between hospitals in countries of the WHO African Region and hospitals in other regions. TPI takes the learning and experience from across the African region and moves the role of partnership working into new and critical areas to support the development of quality, resilient, and universal health services [39]. At the heart of this model is the fact that partnerships provide a vehicle for dialogue that generates ideas and opportunities to address the multiple barriers to improvement. The focus on solution generation co-developed by hospital partnerships supports improvement and generates mutual benefits for all parties involved. The TPI approach to improvement is based on a six-step cycle and facilitates the development of partnerships, the systematic identification of patient safety gaps, and the development of an action plan and evaluation cycle according to the following steps:

  1. Partnership development that supports the establishment of fully functioning, communicative twinning relations between two or more health institutions.

  2. Needs assessment that allows the baseline situation to be captured, so priority technical areas can be identified to form the basis for an evaluation of the implemented activities.

  3. Gap analysis that allows for the identification of key priority areas for focused improvement efforts.

  4. Action planning that provides twinning partnerships with the opportunity to jointly agree and develop targeted action plans.

  5. Action, the stage of implementation of the agreed plan of activity, with focused action on both arms of the twinning partnership to help deliver effective health services.

  6. Evaluation and review, which enables twinning partnerships to assess, against their baseline, the impact of their technical improvement work.

2.1.2 Institute for Healthcare Improvement Breakthrough Collaborative

A reference model widely used for the implementation of improvement interventions is the Collaborative Breakthrough model proposed by the Institute for Healthcare Improvement [40]. The principle that underlies the use of this model is that for every intervention to be successful it must be adapted to the context, taking into account the organizational and cultural specifics and the available human and economic resources. Once the area that needs improvement has been identified, actions must be based on evidence in the literature, solutions promoted by international actors, or experiences from other contexts that have already produced evidence of effectiveness. Multidisciplinary groups of experts evaluate the hypothesized solutions with respect to the available literature, reference standards, and the characteristics of the context of application. Social, organizational, anthropological, economic, human factors, and ergonomics knowledge, combined with clinical knowledge, can facilitate a better understanding of the emergent characteristics of the system, which in turn can inform interventions that try to take into account the complexity of the system. According to the model, each intervention, which could be an organizational change, the implementation of a new cognitive support tool, or a tool for decision-making, becomes the object of a pilot project in the specific context and is evaluated in terms of usability, feasibility, and impact on quality and safety. In this phase, the Plan-Do-Study-Act model (reference) allows the improvement hypothesis to be periodically reassessed and reformulated in relation to what emerges from the study phase. In the evaluation phase, qualitative and quantitative methods of analysis can be used: questionnaires, interviews, and field observations, along with pre-post intervention prospective analysis.
The results of the tests and the analysis of the data are the basis for a possible redesign of the solution to make it more appropriate for the context of application.
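The iterative Plan-Do-Study-Act logic described above can be sketched as a simple loop. The function names, stopping rule, and toy "score" below are hypothetical placeholders; in a real project each phase involves the qualitative and quantitative methods discussed in the text (questionnaires, interviews, field observation).

```python
# A minimal sketch of a PDSA cycle, assuming hypothetical callables for the
# pilot, evaluation, and redesign phases. Not a real improvement protocol.

def pdsa(hypothesis, run_pilot, evaluate, redesign, max_cycles=3):
    """Iterate Plan-Do-Study-Act until the evaluation is satisfactory."""
    for cycle in range(1, max_cycles + 1):
        plan = hypothesis                      # Plan: current improvement hypothesis
        results = run_pilot(plan)              # Do: pilot the intervention in context
        ok, findings = evaluate(results)       # Study: compare results to expectations
        if ok:                                 # Act: adopt the solution...
            return plan, cycle
        hypothesis = redesign(plan, findings)  # ...or redesign and run another cycle
    return hypothesis, max_cycles

# Toy usage: an intervention whose "score" improves with each redesign.
outcome, cycles = pdsa(
    hypothesis={"score": 1},
    run_pilot=lambda p: p["score"],
    evaluate=lambda r: (r >= 3, "score too low"),
    redesign=lambda p, f: {"score": p["score"] + 1},
)
print(outcome, cycles)  # {'score': 3} 3
```

The key design point the model makes, mirrored in the loop, is that redesign is driven by what emerges from the study phase, not decided in advance.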

2.1.3 Case Study: Kenya

The Centre for Clinical Risk Management and Patient Safety—WHO Collaborating Centre in Human Factors and Communication of the Delivery of Safe and Quality Care (Italy), in collaboration with the Centre for Global Health of the Tuscany Region and the University Hospital of Siena, promoted in 2015 a partnership with a hospital in Kenya with a focus on patient safety and quality improvement. The operative approach promoted for introducing improvement solutions and strategies in the hospital combined the WHO African Partnership for Patient Safety approach with the Institute for Healthcare Improvement Collaborative Breakthrough model. Following the six-step cycle of the APPS, an on-the-ground quantitative self-assessment, a gap analysis, and a needs assessment were conducted, from which it emerged that there was a need to work on the safety and quality of maternal and neonatal care. The partners thus decided to focus on building a collaborative project for the implementation of the Safe Childbirth Checklist (SCC) and to evaluate the locally adapted version of the tool in terms of its impact on safety and quality, its usability, and its feasibility.

The process of implementation combined the Collaborative Breakthrough model and the Twinning Partnership for Improvement and comprised the following steps:

  1. Evaluation of the specific characteristics of the context in terms of safety culture, resources and technology available, organization of the work, workflows, characteristics of the workers, their relations and needs, and cognitive workload.

  2. Administration of a questionnaire to assess the level of maturity of the safety culture (the Surveys on Patient Safety Culture™ (SOPS™) Hospital Survey released by the Agency for Healthcare Research and Quality, AHRQ) [42].

  3. Creation of a multidisciplinary group for the personalization of the SCC: gynecologists, midwives, and nurses from the maternal and child department, the safety and quality team of the hospital, and quality, safety, and HFE experts from the partner institutions.

  4. Coaching of the frontline workers on the use of the SCC tool.

  5. Six-month piloting of the SCC.

  6. Evaluation of the impact of the SCC on selected process indicators related to the care delivered to the mother and the newborn.

  7. Administration of a questionnaire to evaluate the usability and feasibility of the tool.

  8. Application of the PDSA cycle to re-evaluate the first version of the SCC and re-customize the tool according to the results of the clinical record review and the usability questionnaire.

The AHRQ Hospital Survey on Patient Safety was administered to a group of 50 hospital workers to measure their perceptions of patient safety issues, medical errors, and event reporting. The analyses showed that workers felt that top management was committed to improving patient safety and that this represented a positive platform for developing quality and safety interventions. At the same time, about 50% of the staff associated the occurrence of an adverse event with the risk of being blamed, rather than seeing the event as a learning opportunity. Linked to this, most of the health workers reported a limited culture of reporting near-misses, although in the few cases in which adverse events had been reported and discussed, this had produced positive change. Lastly, staff indicated that they wanted to be part of a positive environment for teamwork and collaboration with top management.

The second source of evaluation of the introduction of the SCC was a questionnaire administered to users, aimed at understanding whether the checklist was usable and coherent with the workflow and work organization, and whether it overloaded workers or facilitated communication, teamwork, and adherence to best clinical practices. The results of the questionnaire showed that 70% of the midwives considered the checklist easy or very easy to use, 56% said that the tool had significantly improved their practice around childbirth, and 50% reported that it had significantly improved communication and teamwork.

Finally, the evaluation of the impact of the SCC on the quality and safety of care was conducted through a prospective pre- and post-intervention review of a randomly selected sample of clinical records. The analyses showed that the introduction of the tool had led to a significant increase in the evaluation of heart rate during the pre-partum period, in the administration of antibiotic therapy when the mother’s temperature exceeded 38 °C or the membranes had been ruptured for more than 24 h, and in the administration of antihypertensive treatment when diastolic blood pressure exceeded 120 mmHg [43].
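A pre-post comparison of process indicators like those above is typically tested as a difference between two proportions. The sketch below shows one common way to do this, a pooled two-proportion z-test, using only the standard library; the counts are invented for illustration and are not the data from the Kenyan study.

```python
import math

def two_proportion_z(hits_pre, n_pre, hits_post, n_post):
    """z statistic for the difference between a pre- and a post-intervention
    proportion, using the pooled standard error. |z| > 1.96 is significant
    at the conventional two-sided 5% level."""
    p_pre = hits_pre / n_pre
    p_post = hits_post / n_post
    pooled = (hits_pre + hits_post) / (n_pre + n_post)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_pre + 1 / n_post))
    return (p_post - p_pre) / se

# Invented example: an indicator documented in 30 of 100 records before the
# SCC and in 55 of 100 records after it.
z = two_proportion_z(30, 100, 55, 100)
```

With these invented counts, |z| exceeds 1.96, so the change would be judged significant; the published evaluation [43] reports the actual indicator-level results.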

2.2 Challenges and Lessons Learned from the Field Experience and the Need for More Extensive Collaboration and Integration of Different Approaches

The implementation of the Safe Childbirth Checklist in Kenya represented one of the first attempts to merge internationally validated models for quality and safety improvement in healthcare. The positive results obtained in terms of clinical and organizational outcomes demonstrated that the integration of the two models can give significant support in understanding and identifying what should be done to promote improvement and what kinds of interventions are the most suitable and effective for a specific context. Following the TPI six-step cycle and the QI approach, it is possible to describe the level of maturity of a system in terms of safety culture and safety “logistics” (needs assessment); to identify possible gaps in the care process and the clinical areas where an intervention is necessary; and to plan actions according to the gap analyses and act according to the characteristics of the environment while testing improvement hypotheses and possible prototypes. However, the key technical and social aspects that needed to change for effective implementation were not always made explicit by these approaches. What therefore needs to be further investigated and discussed is how HFE can become a driving component of safety and quality improvement programs. A more HFE-oriented approach, aimed at promoting behavioral change toward safer healthcare systems, could promote a deeper understanding of the technical, socioeconomic, political, and environmental sub-systems when building an understanding of the work system characteristics. Moreover, a more comprehensive understanding of the relations between all the components of the system (the different stakeholders that act in the context at different levels, their relationships, and their needs) could help scale up solutions from the local to the national level while keeping a bottom-up approach to the design of the solution.
In other words, HFE could make explicit how to make changes toward safety of care happen and how to fit theory into the real world, in the specific context, taking into account the peculiarities of the system and promoting multidisciplinary collaboration to face, in a holistic manner, multidimensional issues such as those that arise from a high-complexity system like healthcare.

2.3 Human Factors and Ergonomics

According to the International Ergonomics Association, “Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance.” Wilson (2014) further argues that HFE has six fundamental notions that define the approach that should be adopted by practitioners and researchers: (1) systems approach; (2) context; (3) interactions; (4) holism; (5) emergence; (6) embedding. In other words, HFE takes a systems approach that acknowledges the importance of context, emergence, and holism in elucidating interactions between various system elements, and developing this understanding requires being embedded in the system. This suggests that HFE should always be embedded in the practice of healthcare for effective patient safety, and therefore HFE (and consequently those responsible for implementation) should be viewed as part of the organization and not as outside consultants. At the heart of the embedded approach to HFE is the participation of all key stakeholders and subject matter experts [44]. In fact, participatory ergonomics is well established: for example, almost 20 years ago Haines et al. (2002) proposed and validated a participatory ergonomics framework. The participatory ergonomics approach focuses on involving people in both planning and controlling a significant amount of their own work activities, coupled with ensuring that they have sufficient knowledge and power to influence processes and outcomes [45]. Due to its focus on and acknowledgment of stakeholders at all levels in the system, HFE also promotes a micro, meso, and macro view of the system.
At the micro level, the focus is on the individual and their interactions with their task (e.g., between a nurse and their patient), while the meso level takes a slightly broader view at the group or team level and their interaction with work. Lastly, at the macro level, the characteristics of the whole system are taken into account and organizational factors need to be considered. Important models at this level of analysis are those developed by Rasmussen (1997), the specifics of which are discussed elsewhere in this chapter, as they promote both a top-down and a bottom-up approach.

Human factors and ergonomics focuses on the interactions between humans, technologies, and organizations within a physical and cultural environment. The fundamental notions of HFE mean that the tools and methods that support the implementation of patient safety interventions can be adapted to the context and the needs of local stakeholders. Further, the approach considers the interaction with healthcare operators, acknowledging several dimensions of the implementation site at the different levels of the system: micro, meso, and macro (i.e., it promotes a systemic view of the implementation process). The main interactions are those that derive from the complexity of the system, and in particular from the hospital organization (design of clinical pathways, healthcare operator workloads and shifts, protocols, procedures, tasks, and activities), the environment/physical organization (facilities, furniture and device design; technical and economic resources), and the human aspects influencing care delivery (religion, customs, social behaviors, social organization, social hierarchies).

From a healthcare perspective, the dual outcomes of HFE can be reoriented as patient outcomes (quality of care and patient safety) and employee and organizational outcomes [46]. Importantly, HFE acknowledges the interdependence of these two outcomes: in order to promote patient safety outcomes, it is necessary to promote organizational outcomes (including the well-being of those working within these organizations). The ability of HFE to support these two outcomes depends on its understanding of sociotechnical systems theory and its values. Considering the clear social and technical characteristics of healthcare highlighted earlier in this chapter, an understanding of sociotechnical systems theory is of obvious benefit here. Clegg (2000) argued that sociotechnical systems theory “has at its core the notion that the design and performance of new systems can be improved, and indeed can only work satisfactorily, if the social and the technical are brought together and treated as interdependent aspects of a work system.” Human factors and ergonomics practitioners therefore take the technical (processes, tasks, and technology used to transform inputs into outputs), social (attributes of people, such as skills, attitudes, and values; relationships among people; reward systems), and environmental sub-systems (outside influences, such as stakeholders) into account when trying to build an understanding of the work system characteristics. Sociotechnical systems principles were first proposed by Cherns in 1976 and have subsequently been developed by several authors, including Clegg (2000). Recently, Read et al. proposed a set of values for HFE and sociotechnical systems theory based on these principles:

  1. Humans as assets

  2. Technology as a tool to assist humans

  3. Promotion of quality of life

  4. Respect for individual differences

  5. Responsibility to all stakeholders

HFE therefore places an emphasis on seeing the humans within the system (patients, caregivers, etc.) as assets rather than “problems” or potential for introducing error. These principles and values are again consistent both with the participatory ergonomics principles and with recent calls for transdisciplinary teams focused on engaging with all relevant stakeholders. It is therefore clear that HFE is a salient discipline for the problems faced by the healthcare system relating to patient safety.

The application of the HFE participatory approach within healthcare has been extensively researched, with Hignett et al. (2005) illustrating the numerous benefits associated with such an approach. Within the context of this book chapter, the ability of participatory ergonomics tactics to promote transdisciplinarity in team characteristics [47] is also an important consideration [46]. This is vital, as earlier parts of this chapter highlight the increasing need for transdisciplinary team collaboration in solving complex healthcare and patient safety issues. Unfortunately, HFE is currently well established only in the West and has little traction in many countries of the Global South (see Thatcher and Todd 2019 for further details [46]). Furthermore, when multinational transdisciplinary teams do work in healthcare in emerging economies, the nature of the collaboration is typically poor, in spite of the existence of good-practice frameworks. Schneider and Maleka (2018) and Hedt-Gauthier et al. (2018) have both illustrated the problematic nature of these relationships in healthcare. Nor are these problems isolated to healthcare settings, with Thatcher and Todd (2019) arguing that it is necessary to foster respectful progress through a program of action that acknowledges the lessons that the people of the Global South can teach the North.

3 Way Forward

3.1 International Ergonomics Association General Framework Model

In response to the problems identified above, the International Ergonomics Association (IEA) has developed a General Framework Model (GFM) that focuses on using the values of HFE to guide its interactions and collaborative development efforts in LMICs. Evidence on patient safety interventions has mainly come from high-cost projects in HICs, and this evidence needs translation and adaptation when interventions are developed for LMICs. Human factors and ergonomics, and in particular the IEA General Framework Model, are the suggested research approaches for adapting tools to the context within which they will be applied. Indeed, Thatcher and Todd (2019) recently argued that training and implementation models must focus on upskilling local capacity, allowing LICs and LMICs to solve their own problems, thus recognizing the emergent characteristics of patient safety issues and the emergent nature of organizational culture. The IEA approach is consistent with this and is underpinned by several philosophical standpoints published in the 2018 triennial report of the IEA’s international development standing committee. These focus on:

  1. An engagement with, and understanding of, how knowledge and technology are effectively diffused across countries. That is, diffusion occurs within sociotechnical systems and as such should be negotiated, enabled, and diffused (Greenhalgh et al. 2004).

  2. Using the relationship between stakeholders, emergence, and networks, as promoted by Wheatley and Frieze, to promote the development of communities of good practice and then translate these into systems of influence.

  3. Closer alignment and integration of science and practice.

The IEA General Framework Model was developed on the basis of the aforementioned principles and focuses on providing a participatory framework to facilitate the systematic design of HFE-related projects. Although the GFM outlined in Fig. 12.1 is presented as an eight-step model, it is in fact a highly iterative process: as the characteristics at one step are made explicit, they may require the re-examination of previous steps. For example, as the understanding of who the stakeholders are (step 4) and what the relationships between them are (step 5) develops, the understanding of what a value-added topic is (step 1) and what the actual needs are (step 3) may need to be refined. Through this iterative process, the various stakeholders within the system are able to discover shared objectives and goals, and consequently to collaborate in generating ideas about the solutions to be implemented within the constraints of the system they are attempting to shape. The framework therefore promotes an interrogation of the social characteristics of the system (through a detailed examination of the various stakeholders and their relationships to each other) and of how the technical aspects of the system can be aligned with the strengths and weaknesses of the various stakeholders through the development of benefit and implementation strategies. The framework also promotes the use of contextually appropriate tools and methods at each step that meet the requirements of elucidating the necessary information. For example, in more advanced systems the initial steps (1–3) can be facilitated through the use of existing HFE tools such as cognitive work analysis, while in less mature systems alternative tools may be more appropriate.

Fig. 12.1

International Ergonomics Association model for the promotion of collaboration across multiple stakeholders within a system

As mentioned in Sect. 12.2.2, and as emerged from the overview of the barriers and facilitating factors that can influence the positive results of an improvement project, the context and its actors (stakeholders) are the main elements to take into account when designing and implementing solutions. This requires an appreciation of both the social and the technical components of the system within which the improvement project is to take place. However, simply understanding the context is not sufficient for the success of interventions that aim to create long-lasting behavioral change that becomes part of the cultural heritage of a specific system and a shared and recognized attitude. For this cultural change to last over time, it has to be embedded in the system: it needs to be thought of, designed, and implemented by actors who participate in the system, who are part of the system, and who are recognized as being part of that system. Furthermore, the emergent characteristics of safety and culture need to be taken into account, and those who remain within the system once the improvement project is complete need to be empowered to understand the system and to respond appropriately to new emergent problems.

Considering the case study on the introduction of the SCC in one hospital in Kenya, we argue that the application of the GFM would have represented a fundamental step for the implementers before the start of the collaborative, helping them better understand the sociotechnical characteristics of the setting and thus reduce possible challenges and improve the sustainability of the improvements made. At the beginning of the project, no HFE experts were available within the hospital, nor were there experts in safety and quality of care. The application of the GFM would have helped the external experts, who had little knowledge of the particular characteristics of the system and of its level of maturity in terms of safety culture and safety “logistics,” to understand how to make the new improvement solution work within the everyday local way of working at the frontline. This would have been an initial step in ensuring that all local stakeholders were identified, valued, empowered, and included in problem identification and solution finding. As such, an important first step in the process of making HFE knowledge and principles (and, for that matter, safe and quality healthcare) available on the ground through the transfer of knowledge and coaching would have taken place.

Certainly, the bottom-up approach followed in the introduction of the SCC made the direct participation of hospital stakeholders possible from the very beginning of the project, but to date it has not been sufficient to turn the initiative into a large-scale project or to involve macro-system-level actors such as institutional bodies.

As we continue to seek to improve the provision of healthcare across the globe, a deeper integration between quality and safety improvement models and HFE models would be an important and useful departure point. Implementation science and HFE promote a systemic view of patient safety and advocate a movement away from disciplinarity toward multi- and transdisciplinary approaches to solution finding. It is our contention that integrating our models to foster such an approach, coupled with an acknowledgment of local knowledge and skills in LMICs, is vital for future improvement projects. Only in such an integrated manner would it be possible to take the implementation of both quality and safety improvement and human factors and ergonomics projects beyond their current scope.


  1. Bagnara S, Parlangeli O, Tartaglia R. Are hospitals becoming high reliability organizations. Appl Ergon. 2010;41(5):713–8. Epub 2010 Jan 27.


  2. Vincent C, Amalberti R. Safer healthcare. Strategies for the real world. Cham: Springer Open; 2016.


  3. National Academies of Sciences, Engineering, and Medicine. Crossing the global quality chasm: improving health care worldwide. Washington, DC: The National Academies Press; 2018.


  4. Tartaglia R, Albolino S, Bellandi T, Bianchini E, Biggeri A, Fabbro G, Bevilacqua L, Dell’erba A, Privitera G, Sommella L. Adverse events and preventable consequences: retrospective study in five large Italian hospitals. Epidemiol Prev. 2012;36(3–4):151–61.


  5. Slawomirski L, Auraaen A, Klazinga N. The economics of patient safety. Strengthening a value-based approach to reducing patient harm at national level. Paris: OECD; 2017.


  6. Wilson RM, Michel P, Olsen S, Gibberd RW, Vincent C, et al. Patient safety in developing countries: retrospective estimation of scale and nature of harm to hospitalised patients. BMJ. 2012;344:e832.


  7. Hignett S, Lang A, Pickup L. More holes than cheese. What prevents the delivery of effective, high quality and safe health care in England? Ergonomics. 2018;61(1):5–14.


  8. La Pietra L, Calligaris L, Molendini L, Quattrin R, Brusaferro S. Medical errors and clinical risk management: state of the art. Acta Otorhinolaryngol Ital. 2005;25(6):339–46.


  9. Brennan TA, Leape LL, Laird NM, Hebert L, Localio AR, Lawthers AG, et al. Incidence of adverse events and negligence in hospitalized patients. 2010 [cited 2019 Oct 18]. Available from:

  10. WHO. Safe childbirth checklist programme: an overview. Geneva: WHO; 2013.


  11. Jha AK, Larizgoitia I, Audera-Lopez C, Prasopa-Plaizier N, Waters H, Bates DW. The global burden of unsafe medical care: analytic modelling of observational studies. BMJ Qual Saf. 2013;22(10):809–15.


  12. Boadu M, Rehani MM. Unintended exposure in radiotherapy: Identification of prominent causes. Radiother Oncol. 2009;93(3):609–17.


  13. World Health Organization. Patient safety: making health care safer. 2017 [cited 2019 Oct 16]. Available from:

  14. Zecevic AA, Li AH-T, Ngo C, Halligan M, Kothari A. Improving safety culture in hospitals: facilitators and barriers to implementation of systemic falls investigative method (SFIM). Int J Qual Health Care. 2017;29(3):371–7.


  15. Farokhzadian J, Dehghan Nayeri N, Borhani F. The long way ahead to achieve an effective patient safety culture: challenges perceived by nurses. BMC Health Serv Res. 2018;18(1):654.


  16. Leape LL, Berwick DM. Five years after to err is human: what have we learned? JAMA. 2005;293(19):2384–90.


  17. Hignett S, Lang A, Pickup L, Ives C, Fray M, McKeown C, et al. More holes than cheese. What prevents the delivery of effective, high quality and safe health care in England? Ergonomics. 2018;61(1):5–14.


  18. Clancy CM. Ten years after to err is human. Am J Med Qual. 2009;24(6):525–8.


  19. Mitchell I, Schuster A, Smith K, Pronovost P, Wu A. Patient safety incident reporting: a qualitative study of thoughts and perceptions of experts 15 years after ‘To Err is Human’. BMJ Qual Saf. 2016;25(2):92–9.


  20. Bates DW, Singh H. Two decades since to err is human: an assessment of progress and emerging priorities in patient safety. Health Aff. 2018;37(11):1736–43.


  21. Holden RJ. People or systems? To blame is human. The fix is to engineer. Prof Saf. 2009;54(12):34.


  22. Reason J. Human error: models and management. BMJ. 2000;320(7237):768–70.


  23. Dekker SW. The re-invention of human error. Hum Factors Aerospace Saf. 2001;1(3):247–65.


  24. Woods DD, Cook RI. Nine steps to move forward from error. Cogn Tech Work. 2002;4(2):137–44.


  25. Rasmussen J. Risk management in a dynamic society: a modelling problem. Saf Sci. 1997;27(2–3):183–213.


  26. Cassano-Piche AL, Vicente KJ, Jamieson GA. A test of Rasmussen’s risk management framework in the food safety domain: BSE in the UK. Theor Issues Ergon Sci. 2009;10(4):283–304.


  27. Ghobashi MM, El-Ragehy HAG, Ibrahim HM, Al-Doseri FA. Assessment of patient safety culture in primary health care settings in Kuwait. Epidemiol Biostat Public Health. 2014;11(3) [cited 2019 Oct 21]. Available from:

  28. Taylor SL, Dy S, Foy R, Hempel S, McDonald KM, Øvretveit J, Pronovost PJ, Rubenstein LV, Wachter RM, Shekelle PG. What context features might be important determinants of the effectiveness of patient safety practice interventions? BMJ Qual Saf. 2011;20:611–7.


  29. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1(1):1–3.


  30. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):1–12.


  31. Hawe P, Shiell A, Riley T, Gold L. Methods for exploring implementation variation and local context within a cluster randomised community intervention trial. J Epidemiol Community Health. 2004;58:788–93.


  32. Øvretveit JC, Shekelle PG, Dy SM, et al. How does context affect interventions to improve patient safety? An assessment of evidence from studies of five patient safety practices and proposals for research. BMJ Qual Saf. 2011. Published Online First: 13 Apr 2011;

  33. Cristofalo MA. Implementation of health and mental health evidence-based practices in safety net settings. Soc Work Health Care. 2013;52(8):728–40.


  34. Hawkins FH. Human factors in flight. 2nd ed. Orlady HW, editor. Aldershot: Avebury Technical; 1993.


  35. Cupit C, Mackintosh N, Armstrong N. Using ethnography to study improving healthcare: reflections on the “ethnographic” label. BMJ Qual Saf. 2018;27(4):258–60.


  36. Dixon-Woods M. Why is patient safety so hard? A selective review of ethnographic studies. J Health Serv Res Policy. 2010;15(Suppl 1):11–6.


  37. Magazi B, Stadler J, Delany-Moretlwe S, Montgomery E, Mathebula F, Hartmann M, et al. Influences on visit retention in clinical trials: Insights from qualitative research during the VOICE trial in Johannesburg, South Africa. BMC Womens Health. 2014;14(1):1–8.


  38. Hollnagel E, Wears R, Braithwaite J. From safety-I to safety-II: a white paper. 2015. p. 1–32.


  39. Recovery Partnership Preparation Package. Twinning partnerships for improvement. Geneva: World Health Organization; 2016. p. 20.


  40. Institute for Healthcare Improvement. The breakthrough series: IHI’s collaborative model for achieving breakthrough improvement (IHI innovation series white paper). Cambridge: Institute for Healthcare Improvement; 2003.


  41. International Ergonomics Association.

  42. Agency for Healthcare Research and Quality. Surveys on Patient Safety Culture™ (SOPS™) Hospital Survey. Rockville, MD: AHRQ.


  43. Dagliana G, Tommasini B, Zani S, Esposito S, Akamu M, Chege F, Ranzani F, Caldes MJ, Albolino S. WHO safe childbirth checklist: the experience of Kenya according to the WHO African Partnership for Patient Safety. In: Proceedings of the 20th congress of the International Ergonomics Association (IEA 2018), Healthcare ergonomics, vol. I. Cham: Springer; 2018.


  44. Wilson J. Fundamentals of systems ergonomics/human factors. Appl Ergon. 2014;45:5–13.


  45. Hignett S, Carayon P, Buckle P, Catchpole K. State of science: human factors and ergonomics in healthcare. Ergonomics. 2013;56(10):1491–503.


  46. Thatcher A, Todd A. HFE in underdeveloped countries. How do we facilitate equitable, egalitarian, and respectful progress. In: Roscoe R, Chiou E, Wooldridge A, editors. Advancing diversity, inclusion, and social justice through human systems engineering. Boca Raton, FL: CRC Press; 2020.


  47. Naweed A, Ward D, Gourlay C, Dawson D. Can participatory ergonomics process tactics improve simulator fidelity and give rise to transdisciplinarity in stakeholders? A before-after case study. Int J Ind Ergon. 2018;65:139–52.





Corresponding author

Correspondence to Giulia Dagliana.


Rights and permissions

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.


Copyright information

© 2021 The Author(s)

About this chapter


Cite this chapter

Dagliana, G., Albolino, S., Mulissa, Z., Davy, J., Todd, A. (2021). From Theory to Real-World Integration: Implementation Science and Beyond. In: Donaldson, L., Ricciardi, W., Sheridan, S., Tartaglia, R. (eds) Textbook of Patient Safety and Clinical Risk Management. Springer, Cham.


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-59402-2

  • Online ISBN: 978-3-030-59403-9

  • eBook Packages: Medicine, Medicine (R0)