1 Introduction

High-quality medical devices are crucial for diagnosis and disease management. Proper medical service procedures rely not only on the professional knowledge of physicians but also on the functional performance of medical devices [1]. A low-quality medical device is a key risk factor for poor medical services and patient safety. Clinical engineers (CEs) and biomedical equipment technicians (BMETs) in a hospital’s clinical engineering department are the main staff responsible for medical devices [2,3,4]. Therefore, CEs and BMETs with professional capabilities and experience with medical devices play a critical role in patient safety. The clinical engineering department has multiple roles, two of its core functions being risk management and quality assurance [2, 3]. Experience with medical device use is the most effective feedback for maintaining device quality and reducing risks to patients; therefore, several postmarket surveillance systems have been launched to collect adverse event and incident data on medical devices. Examples include medical device reporting (MDR) for the United States Food and Drug Administration [5], medical device market surveillance and vigilance for the European Commission [6], and medical device adverse event reporting (ADR) for the Taiwan Food and Drug Administration [7]. CEs maintain various medical devices in hospitals daily and have firsthand information on medical device use, which could be the most valuable source of event data for a postmarket surveillance system. Medical devices are designed around various technologies and have complex electromechanical and computer-based system components. It is difficult for CEs, especially junior ones, to solve medical device problems and promote quality assurance concurrently. Therefore, a collaborative mechanism for accessing information on appropriate responses to adverse events is invaluable.
Medical devices are distributed across different hospitals; however, the usage and environments in which the same types of medical devices are applied are very similar, so CEs working for different medical service providers probably face the same adverse events and product problems. If CEs could share their practical experiences with medical devices with peers working in other hospitals, their working effectiveness would increase and their professional competency would improve.

Experience is important in learning and knowledge acquisition; through each experience, new knowledge and skills can be obtained [8, 9]. In simulation-based medical education, real-life clinical experience is also required to ensure that learning can be implemented in actual practice [10, 11]. Practical experience with medical devices is significant for CEs working in hospitals. However, most junior CEs lack practical experience in areas such as operating a variety of medical devices, managing medical device adverse events, evaluating risk levels, and implementing proper actions based on risk control. To overcome junior CEs’ insufficient access to such resources, effective remedies include learning through experience and querying senior CEs for advice. CEs work with a variety of medical devices and technologies; thus, there is a demand for practical suggestions that improve medical device usage and for opportunities for continuing education. Senior CEs have experience in medical device quality management and event analysis; if their experiences could be shared systematically, the demands of junior CEs might be met.

This paper presents a web-based platform for sharing the experiences of CEs on medical device incidents, which is expected to improve the exchange of practical information between hospitals. Different experiences with medical device incidents, termed “events” in this paper, were collected on a website. The platform system’s structure and procedures are described in the Methods section. Data from the platform are presented in the Results section. The benefits of this study are summarized and discussed in the Discussion and Conclusions section.

2 Methods

The sharing platform for medical device incident experience is described in three parts: registered participants, experiences shared on the platform, and web-based system operations. The platform structure is shown in Fig. 1 and includes member registration, event submission, event review, and reporting. A detailed description follows.

Fig. 1

System structure of the platform

2.1 Participants in the Platform

Participants were classified into three categories as follows:

  1. (1)

    Member: Any medical institution could sign an agreement to become a “seed hospital” of the platform. Staff with responsibilities related to medical devices in a seed hospital could register as members on the website. CEs, junior CEs, junior biomedical engineers, general affairs personnel, procurement personnel, nurse leaders, and health information engineers were eligible to use the platform; CEs and junior CEs were the primary intended users.

  2. (2)

    Event reviewer: Senior CEs and biomedical engineering experts were invited to be the event reviewers. Reviewers were required to have more than 20 years of work experience in clinical engineering, medical device management, systems integration, medical device regulation, or healthcare technology management in hospitals. The reviewers’ role was to provide practical suggestions based on their professional experience, guiding members in solving medical device problems.

  3. (3)

    Platform manager: The platform managers managed the permissions of members and reviewers, maintained daily platform operation, and maintained the database.

2.2 Experiences Shared on the Platform

The experiences shared on the platform focused on difficulties in using medical devices and incidents involving them. Any questions related to the life cycle of devices in hospitals were welcome. According to their content, events were classified into three types as follows:

  1. (1)

    Medical device management problems: Problems related to medical device management, such as new medical device evaluations, service contract reviews, purchase specifications, acceptance, and disposal, could be submitted to the platform for suggestions.

  2. (2)

    Medical device maintenance and troubleshooting: The hospitals’ clinical engineering departments are mainly responsible for the maintenance and troubleshooting of medical devices, which is where most problems are encountered. Members could submit concerns from their regular work to the platform for advice.

  3. (3)

    Medical device events demonstrated: The events demonstrated on the platform were chosen by reviewers according to their practical usage experience with medical devices, especially general, common, and important devices in hospitals, such as ventilators, hemodialysis machines, and defibrillators. For this type of event, both the context and the troubleshooting of the event needed to be provided on the platform.

2.3 System Operation

The platform runs on the Linux operating system with a MySQL database and was implemented in the PHP (Hypertext Preprocessor) scripting language; it can be accessed at the following uniform resource locator (URL): http://md-share.cycu.edu.tw. Based on the aforementioned definitions of participants and experiences, an experience-sharing process flow was proposed, as presented in Fig. 2. With the approval of their seed hospital, members submitted their practical experiences of medical device problems as events. Each event was reviewed using the platform’s standard review procedure, and suggestions were provided to the submitter in the last step. Final reports combined the event information with the review results and were then posted on the platform.

Fig. 2

Experience-sharing process flow of the platform

Establishing an experience-sharing platform is technically straightforward; the critical steps are creating an event collection procedure and designing a review mechanism that discerns the most useful and practical information. Both are described as follows.

2.3.1 Event Collection

The members were required to fill in an online form before submitting an event to the platform to ensure that the event information was adequate and consistent. The details required on the form are listed as follows (Table 1): (1) contact information of the submitter, which was deidentified; (2) basic information about the device, including the product’s name, manufacturer, and device license number; (3) an event description detailing where, when, and how the event occurred and who was involved; and (4) supporting information to help understand the event, including environmental information, photos, and statistics. Additionally, the submitter was encouraged to note any action or plan resulting from the event, such as corrections.

Table 1 Data requirements for online event submission
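As an illustration, the submission requirements above could be enforced by a simple server-side check before an event is accepted. The following is a minimal sketch in Python (the platform itself is written in PHP); the field names are our own illustrative assumptions, not the platform’s actual schema.

```python
# Sketch of server-side validation for the online submission form.
# Field names are hypothetical, chosen to mirror the requirements in Table 1.

REQUIRED_FIELDS = {
    "submitter_contact",   # deidentified before publication
    "product_name",
    "manufacturer",
    "license_number",
    "event_description",   # where, when, and how the event occurred, and who was involved
}
OPTIONAL_FIELDS = {"environment_info", "photos", "statistics", "actions_taken"}

def validate_submission(form: dict) -> list[str]:
    """Return a list of problems; an empty list means the event is accepted."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - form.keys())]
    unknown = form.keys() - REQUIRED_FIELDS - OPTIONAL_FIELDS
    problems += [f"unknown field: {f}" for f in sorted(unknown)]
    return problems
```

A complete form yields an empty problem list, whereas a submission missing, say, the device license number would be returned to the submitter with the named gap.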

2.3.2 Online Review Mechanism

Part 1 Evaluation Indices

Employing a standard evaluation procedure and consistent data format facilitated the subsequent data analysis and management. The reviewers reached a consensus on event analyses after several meetings and discussions. Each event was analyzed based on five evaluation indices (EIs): (1) severity, (2) breadth, (3) frequency, (4) insidiousness, and (5) correctness [12,13,14,15]. Each EI was graded from 1 to 10 (Table 2). Definitions of the five EIs are provided as follows.

Table 2 Event evaluation indices
  1. (1)

    Severity: Each event received a grade from 1 to 10 points according to its severity. The criteria defining severity were derived from the ISO 14971 [15] requirements. Severity was classified into five levels: negligible (2 points), minor (4 points), serious (6 points), critical (8 points), and catastrophic (10 points).

  2. (2)

    Breadth: The criteria defining the breadth of an event were based on the range of its effect. An event involving one person or a single device received a grade of 1 or 2. An event affecting more than three people or two units received a grade of 5. An event affecting at least 10 people or five units received a grade of 10.

  3. (3)

    Frequency: The grading criteria were based on the event’s probability of occurrence, considering both the number of devices and the number of incidents. For example, suppose a hospital has 300 electrical beds and 30 patient monitors, and each device type is involved in three incidents during a year. A reviewer may grade the electrical bed 1–3 and the patient monitor 5–8 because the per-device incident rate of the patient monitor (3/30 = 10%) is higher than that of the electrical bed (3/300 = 1%). In the review mechanism, an event that seldom occurred was graded 1 or 2. Events that occurred frequently during a certain period (defined as 1 year on this platform) or that involved several devices with similar problems received higher grades, as decided empirically by the reviewers.

  4. (4)

    Insidiousness: Recognizing and evaluating the insidiousness of an event relied on experts with comprehensive experience operating multiple medical devices. The grading criteria depended on whether the event was difficult to identify or led to subsequent process errors. If the problem was easy to identify, the event received a low grade (less than 5). If the event might affect the next process and only appear at the next stage, it received a grade of 5. If the event was difficult to identify and other related systems might be affected, it received a grade of 10.

  5. (5)

    Correctness: The criteria for grading correctness were based on whether the event affected the correctness or completeness of data used for diagnosis and treatment. If the data were not affected, the event received a grade of 1. If the event caused missing information or a data transmission delay that could be corrected by a follow-up process, it received a grade of 5. If the data could not be recovered, it received a grade of 10.
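The five EIs and the frequency rationale above can be made concrete with a small sketch. The `EIGrades` container and `incident_rate` helper below are hypothetical illustrations, not part of the platform’s implementation.

```python
from dataclasses import dataclass

EI_NAMES = ("severity", "breadth", "frequency", "insidiousness", "correctness")

@dataclass
class EIGrades:
    """One reviewer's grades for a single event; each EI is graded 1-10."""
    severity: int
    breadth: int
    frequency: int
    insidiousness: int
    correctness: int

    def __post_init__(self):
        for name in EI_NAMES:
            g = getattr(self, name)
            if not 1 <= g <= 10:
                raise ValueError(f"{name} grade {g} outside 1-10")

    def total(self) -> int:
        """Sum of the five EI grades (used later to summarize hazard level)."""
        return sum(getattr(self, n) for n in EI_NAMES)

def incident_rate(incidents: int, device_count: int) -> float:
    """Annual incidents per device, used to compare frequency between device types."""
    return incidents / device_count

# Frequency example from the text: three incidents per year for each device type.
bed_rate = incident_rate(3, 300)     # 0.01 per bed  -> low frequency grade (1-3)
monitor_rate = incident_rate(3, 30)  # 0.10 per unit -> higher frequency grade (5-8)
```

The per-device rate, rather than the raw incident count, is what justifies grading the patient monitor higher than the electrical bed in the example.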

Each event was demonstrated using a radar chart with the five EIs (Fig. 3). The radar chart is displayed in the report as a reference for the hazard level of an event.

Fig. 3

Event hazard analysis demonstrated in radar chart
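A radar chart such as Fig. 3 places the five EI axes at equal angular spacing and plots each grade as a distance from the center. The following sketch computes the polygon vertices; the function is illustrative only and is not taken from the platform’s code.

```python
import math

EI_NAMES = ("severity", "breadth", "frequency", "insidiousness", "correctness")

def radar_vertices(grades: dict[str, int]) -> list[tuple[float, float]]:
    """Map the five EI grades (1-10) to (x, y) vertices of a radar-chart
    polygon, with axes spaced 72 degrees apart starting at 12 o'clock."""
    n = len(EI_NAMES)
    vertices = []
    for i, name in enumerate(EI_NAMES):
        angle = math.pi / 2 - 2 * math.pi * i / n  # clockwise from the top
        r = grades[name]
        vertices.append((r * math.cos(angle), r * math.sin(angle)))
    return vertices
```

Connecting the five vertices in order yields the hazard polygon shown in the report; a larger enclosed area loosely corresponds to a more hazardous event.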


Part 2 Cause Analysis

For further analysis, the reviewers evaluated the proportions of causes for each event. Five major causes, plus a residual “other” category, were identified on the platform based on quality control analysis [16,17,18].

  1. (1)

    User factor: User factors include abuse of the device, accidental misconnection, device misassembly, failure to perform preuse evaluations, improper connection, incorrect control settings, and failure to read the label.

  2. (2)

    Device factor: Device factors include abuse of the design, labeling error, device failure, device interference, component failure, and packing error.

  3. (3)

    External factor: External factors include failures of the medical gas, vacuum, power, and water supplies.

  4. (4)

    Support system factor: Support system factors include errors in hospital policy, lack of training, and insufficient purchase evaluation.

  5. (5)

    Environment factor: Environment factors include an unsuitable structure in the hospital building, changing temperature and humidity levels, and improper storage.

  6. (6)

    Other factor: Factors that cannot be classified into any of the five groups above are categorized as “other.”

The proportions assigned to the six factors summed to 100%. The reviewers decided the proportion attributed to each factor according to the information provided in the event description and their professional specialties.
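The constraint that the six factor proportions cover all categories and sum to 100% can be checked mechanically. A minimal sketch follows; the factor keys are our own shorthand, not the platform’s labels.

```python
def validate_cause_proportions(proportions: dict[str, float]) -> bool:
    """Check that reviewer-assigned cause proportions cover exactly the six
    factors and sum to 100% (within a small rounding tolerance)."""
    factors = {"user", "device", "external", "support_system", "environment", "other"}
    if set(proportions) != factors:
        return False  # a factor is missing or an unknown factor was used
    return abs(sum(proportions.values()) - 100.0) < 1e-6
```

For example, an event attributed 60% to the device, 30% to the user, and 10% to the support system (with 0% elsewhere) passes the check, whereas a breakdown omitting a category does not.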


Part 3 Recommendations

In the final part of the evaluation, reviewers wrote a short summary and recommendations for each event, which provided the basis for members to take corrective action. Recommendations were made by each reviewer based on their working experience and integrated knowledge. Additionally, reviewers advised whether the event should be submitted to the official ADR system through the seed hospital’s internal procedures.

Figure 4 depicts the workflow of the review mechanism. The evaluation procedure, including scoring of the five EIs, was conducted online case by case. Reviewers were required to log in to the website to review events, and two reviewers independently reviewed each event to obtain objective and practical suggestions. Academic experts, distinct from the reviewers, were invited to finalize the content and suggestions; they evaluated whether any inappropriate information was presented in the content. If not, the final report was completed. If there were conflicts between the two reviewers’ EI grades or suggestions, a third reviewer was called in, and the report was finalized using the three reviewers’ opinions. Generally, a review was expected to be completed within 15 days and to take no longer than 1 month. Each event and its report were added to the event database and posted on the website for members’ self-learning.

Fig. 4

Workflow of online review mechanism
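The two-reviewer step and the escalation to a third reviewer could be sketched as follows. The agreement threshold `max_gap` is an assumption for illustration; the paper does not quantify what counts as a conflict between reviews.

```python
def reconcile_reviews(grades_a: dict, grades_b: dict, max_gap: int = 3):
    """Decide whether two independent EI reviews agree closely enough to go
    to expert finalization, or whether a third reviewer is needed.
    max_gap (the tolerated grade difference) is an illustrative assumption."""
    conflicts = [ei for ei in grades_a
                 if abs(grades_a[ei] - grades_b[ei]) > max_gap]
    if conflicts:
        return ("third_reviewer_required", conflicts)
    # Consensus grade: average of the two reviewers, rounded to an integer.
    consensus = {ei: round((grades_a[ei] + grades_b[ei]) / 2) for ei in grades_a}
    return ("expert_finalization", consensus)
```

In this sketch, close reviews are merged into a consensus and passed to the academic experts, while any EI graded too differently is flagged so a third reviewer can settle it.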

2.4 Questionnaire Survey

This study conducted a performance evaluation to assess the platform’s benefits for CEs.

The participants filled out a three-item questionnaire anonymously.

  1. (1)

    Question 1: How long have you worked in a hospital?

  2. (2)

    Question 2: How often do you browse the platform per month?

  3. (3)

    Question 3: How do you think the platform’s benefits influenced your duties?

Feedback collected online during the third quarter of 2016 was used to evaluate the platform’s benefits for its members.

3 Results

The results of implementing this platform from January to December 2016 are presented as follows.

3.1 Participants

  1. (1)

    Members: The platform approved 116 members from 32 hospitals. Taiwan has 19 medical centers, 81 regional hospitals, and 324 district hospitals; of these, 5 medical centers, 17 regional hospitals, and 10 district hospitals were approved as seed hospitals for the platform. Among the registered members, 35 were from medical centers, 46 from regional hospitals, and 35 from district hospitals. The numbers of seed hospitals and members are listed in Table 3.

    Table 3 Numbers of seed hospitals and members registered on the platform
  2. (2)

    Event reviewers: Eight senior CEs were invited to be reviewers. Each has at least 20 years of working experience and has led a hospital’s biomedical engineering department. Three experts, distinguished professors of biomedical engineering, collaborated in the event review process to finalize the reports.

3.2 Experiences (Events) Collected

A total of 70 events were submitted to the platform in 2016. According to the device risk classification in the Taiwan Food and Drug Administration regulations [19], 24 events involved Class I devices, 42 involved Class II devices, 2 involved Class III devices, and 2 did not involve a medical device. Only 3% of the events were related to high-risk devices; the distribution of events by device risk classification is shown in Table 4. Furthermore, Table 5 lists the category distribution of the 70 events, based on the Taiwan Food and Drug Administration regulations [20]. Table 5 shows that cardiovascular devices, general hospital and personal use devices, and general and plastic surgery devices were the top three of the 16 categories, with proportions of 22.9%, 21.4%, and 15.7%, respectively. None of the 70 events pertained to immunology and microbiology devices, dental devices, neurological devices, or orthopedic devices. Additionally, two reported events, involving an ethylene oxide cylinder and a reverse osmosis water system, fell outside the medical device regulatory categories but were still reported by BMETs because they related to cleaning and sterilization equipment.

Table 4 Distribution of events based on device risk classification
Table 5 Distribution of cases according to device category

The majority of the events caused no damage or harm to patients or medical staff. However, in six events, patients incurred injuries ranging from slight to serious, though none permanent. The injuries included a patient’s skin burned by an electrosurgical unit, a patient falling because of a poor-quality bed component, and low SpO2 caused by a ventilator malfunction.

3.3 Events Evaluation Analysis

Among the 70 events, 62 were reviewed online with event reports; eight events were returned to the submitters for supplemental information. The format of the event report is listed in Table 6. The 62 events were evaluated on the five EIs, and the distribution of total grades is shown in Table 7. Summing the five EI grades for each event, all events scored below 30, indicating that the hazard levels were not very high, and 71% of the events scored between 11 and 20.

Table 6 Items and content of event reports
Table 7 Distribution of events according to the sum of the five indices

Regarding cause analysis, many events were caused by more than a single factor; accidents usually occurred not only because of equipment failure but also because of human error. Table 8 shows the number of times reviewers assigned a factor a proportion above 30%. Device factors were cited 53 times, including poor components, packing errors, and inappropriate software settings. In addition, support system factors were cited 13 times and user factors 12 times.

Table 8 Distribution of factors number according to cause analysis

Table 9 presents the number of events assessed according to the failure codes defined by Gonnelli et al. [21] and Wang et al. [22]. Table 9 shows that the incidents were dominated by ACC (accessory and supplies failure) and USE (failure induced by use), followed in order by UPF (unpreventable failure), PPF (predictable and preventable failure), and PF (potential failure). These results can serve as a supporting reference for device maintenance.

Table 9 Distribution of events according to failure code

In the final part of the reviews of the 62 events, reviewers provided practical suggestions on how to deal with the incidents. Reviewers also suggested that 20 of the 62 events should be reported to the official ADR system through the seed hospitals’ internal adverse event reporting procedures.

3.4 Questionnaire Survey

A simple, anonymous, three-question questionnaire survey was conducted, and 93 participants replied. The results are listed as follows (Table 10).

Table 10 Anonymous questionnaire survey results
  1. (1)

    Question 1: How long have you worked in a hospital?

    • Of the respondents, 28% worked for 1–3 years, 32% worked for 4–6 years, 24% worked for 7–9 years, and 16% worked for more than 10 years.

  2. (2)

    Question 2: How often do you browse the platform per month?

    • Of the respondents, 46% reported 1–2 times, 36% reported 3–4 times, and 18% reported more than 4 times.

  3. (3)

    Question 3: How do you think the platform’s benefits influenced your duties?

    • Of the respondents, 16% believed it to be extremely helpful, 52% believed it to be very helpful, 26% believed it to be helpful, and 6% believed it to be of no help.

4 Discussion and Conclusions

The platform collected events related to various medical devices in practice through a case-by-case evaluation mechanism, and practical suggestions were provided to improve medical device management. This benefits not only CEs but all involved medical staff. Based on the reviewers’ evaluations, the platform provides analyses and suggestions in reports posted on the website, which members can apply to their daily hospital duties. The events collected by this platform represent practical information on medical device usage. This not only helps CEs, particularly junior CEs, develop professional competency but also forms a basis for improving hospitals’ medical device quality management systems.

After this platform was implemented for 1 year, 70 events were submitted by 116 members, distributed among 32 seed hospitals. Sixty-two event reports passed the rigorous evaluation procedure, and only registered members were allowed to read the online content. According to the results of the questionnaire survey, 94% of the participants agreed that the information from the platform was helpful for their work in hospitals.

Today, obtaining information online is easy, and there are many channels for sharing experience; however, most of this information is not supported by solid evidence. The platform in this study uses a rigorous event review mechanism: each event was reviewed by at least two experts to provide neutral, professional, and useful recommendations for members, and a third expert was required only when views conflicted. This rigor distinguishes the platform from typical experience-sharing sites, especially for medical device management.

According to the World Health Organization’s policies and a statement from the International Federation for Medical and Biological Engineering regarding the World Health Organization’s reform [23, 24], human resources are critical to the global strategy for health, and professional competence is the key factor in delivering healthcare services. By improving staff performance through this platform, internal medical device management and risk control in hospitals can be enhanced; moreover, medical device quality assurance systems can be improved, creating a safer hospital environment for staff and patients. CEs could use the platform to propose improvements and procedural modifications after adverse events once causes are identified. The establishment of this platform has received a positive response from participants. Although the platform’s design still requires optimization, it can serve as a model for training in medical device quality management in the future.