Background

The first step in the development of an outcomes-based undergraduate medical curriculum is the performance of a needs assessment to ascertain what junior doctors are expected to know [1, 2]. The results of such a needs assessment serve to inform those involved with curricular design of the core knowledge and skills that medical students need to acquire during their undergraduate training [1]. In the absence of expert consensus, however, curricular content is subject to the opinion of individual lecturers and, therefore, variable between academic institutions [3].

Worldwide, graduating medical trainees lack adequate electrocardiogram (ECG) competence [4,5,6,7], i.e. the ability to accurately analyse and interpret an ECG [8]. Yet, ECG competence is considered an Entrustable Professional Activity (EPA) that medical students need to master prior to graduation [9]. It is important to consolidate ECG knowledge and skills before qualifying, as there is usually little formal training in electrocardiography once medical students graduate [10].

Even though electrocardiography forms part of core undergraduate medical training [11], there is a lack of guidance as to which ECG diagnoses should be taught to medical students [5]. A recent systematic review found significant variation in the topics of undergraduate ECG instruction [12]. This could be explained by the inconsistency in undergraduate ECG curricular recommendations in the literature [9, 13]. Central to addressing the lack of ECG competence is the establishment of a mutually agreed curriculum.

Establishing consensus using the Delphi method

Delphi studies are a recognised method for establishing expert consensus in curricular development [14]. The Delphi technique is an iterative process through which expert opinion is transformed into consensus amongst experts [15]. Experts in the field are invited to complete multiple rounds of questionnaires. These questionnaires are completed anonymously, and the collective results are shared with participants in subsequent rounds [16, 17].

The classical Delphi method starts with a set of open-ended questions (to collect qualitative data) in the first round. Participants’ responses are then summarised and used to create closed-ended questions (to collect quantitative data) for the subsequent rounds [18, 19]. However, multiple studies in health professions education have adopted a modified Delphi technique wherein the first round already starts with closed-ended questions that are carefully selected by the convener through literature reviews and expert consultation [20,21,22]. As the methodology is flexible, a modified Delphi study can still collect input through open-ended questions, by asking participants if they have any additions to the list prepared by the convener [14, 23].

In a Delphi study, quantitative data is collected by means of directed questions, in the form of Likert-type questions, through which participants indicate how strongly they agree or disagree with each statement in the survey of each round [16, 24]. Likert-type questions typically ask, “please select how strongly you agree with the following statement…”. Most studies use five response categories (i.e. “strongly disagree”, “disagree”, “uncertain”, “agree”, “strongly agree”), with a central point (i.e. uncertain) that allows participants to opt out if they are unsure about the statement [3, 15, 22, 25, 26]. Frequencies and the mode are appropriate descriptive statistics for the categorical data collected by Likert-type questions [22, 24, 27]. Frequencies indicate the variability of the data, i.e. the level of agreement for each statement in the survey [28], whereas the mode indicates the central tendency of the data (i.e. the response most commonly selected).
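
To make these descriptive statistics concrete, the short Python sketch below computes the response frequencies and the mode for a single Likert-type item. The responses and their 1–5 coding are hypothetical and serve only as an illustration.

```python
from collections import Counter

# Hypothetical responses to a single Likert-type item,
# coded 1 = "strongly disagree" ... 5 = "strongly agree".
responses = [5, 4, 5, 3, 5, 4, 5, 5, 2, 4]

# Frequencies: how often each response category was selected
# (reflects the variability, i.e. the level of agreement).
frequencies = Counter(responses)

# Mode: the most commonly selected response
# (reflects the central tendency of the categorical data).
mode = frequencies.most_common(1)[0][0]

print(dict(frequencies))  # {5: 5, 4: 3, 3: 1, 2: 1}
print(mode)               # 5
```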

The level of agreement amongst participants that is considered to constitute consensus varies between 51 and 80% in the literature on Delphi studies [14]. Investigators decide a priori on the level of agreement that will be regarded as having reached consensus [29]. Although there is no universal value for this purpose, many studies use 75% agreement between experts as the cut-off value to establish consensus in Delphi studies [29]. Surveys are administered through multiple rounds until the predetermined level of consensus for each statement is reached. This usually occurs after the third round of the study [14, 16].

There are no rigid criteria for the selection of participants in a Delphi study, nor for how many participants should be recruited [30]. The investigator needs to take great care in the selection of potential participants [31]. Participants who are invited to take part in a Delphi study should be content and context experts, so that the results are accurate and reliable [32, 33]. The panel of experts invited to take part should have a keen interest in the subject matter [26]. Also, because of the risk of losing participants between successive rounds, those invited should be willing to take part in a multi-stage surveying process [15].

The aim of this study was to establish consensus on an outcomes-based undergraduate electrocardiography curriculum for medical students, amongst specialists who regularly analyse and interpret ECGs in clinical practice (i.e. content experts) and who are involved in ECG training (i.e. context experts).

Methods

This study used a modified Delphi technique to establish consensus on a curriculum for undergraduate ECG training.

Delphi expert panel

Cardiologists, Specialist Physicians, Emergency Physicians, Family Physicians and Medical Education Specialists at the eight medical schools of South Africa were invited to take part in this modified Delphi study. The purpose of the study was explained in the letter of invitation. On acceptance, participants were sent an email with a link to the online survey. Consent for participation in the study was obtained electronically before the online survey of the first round could be accessed. Invitees were also asked to nominate other colleagues who were responsible for the ECG teaching of undergraduate medical students and/or worked closely with junior doctors at the academic institutions or at hospitals that serve as intern training sites.

Participants were only included as part of the expert panel if they fulfilled all of the following criteria:

  • Participants had to be content experts (i.e. have specialist level knowledge of electrocardiography and/or medical education). Therefore, we included participants if they were either

    • registered as a specialist with the Health Professions Council of South Africa (HPCSA) in Cardiology, Internal Medicine, Emergency Medicine or Family Medicine and practised in an environment that required regular ECG analysis and interpretation (i.e. coronary care unit, medical wards, outpatient department, and/or emergency unit), or

    • a qualified medical doctor with a postgraduate qualification or fellowship in medical education

  • Participants required context expertise (i.e. they had to be familiar with the environment in which junior doctors work and/or train in South Africa). We included participants if they were either

    • working in a hospital or clinic where they do ward rounds or review patients with junior doctors (interns, medical officers), or

    • involved in ECG teaching by either giving formal ECG lectures to undergraduate students or reviewing ECGs with junior doctors (interns, medical officers) on ward rounds

In South Africa, medical students undergo six years of undergraduate training before graduating as medical doctors, with the exception of one medical school that offers a five-year course. South African undergraduate medical programmes include both pre-clinical and clinical training. Although there is no national or international guideline on which undergraduate ECG training or assessment is based, the eight medical schools in South Africa offer comprehensive undergraduate ECG teaching, as shown in Table 1. Medical students receive formal ECG tuition during pre-clinical (typically second and third year) and clinical training (typically fourth to sixth year) and are exposed to real-life ECG analysis and interpretation during various clinical clerkships. However, each academic institution chooses its own curriculum and appoints lecturers (from various departments) who are available and show an interest in the subject. For the most part, ECG competence is assessed by multiple choice questions (MCQ), objective structured clinical examinations (OSCE) and as part of clinical examinations.

Table 1 Overview of undergraduate ECG training at the eight South African medical schools

After graduation, South African medical graduates complete a two-year internship at an accredited hospital, where they practise under supervision. All medical interns rotate through Family Medicine (with dedicated time in Emergency Medicine and Psychiatry), Internal Medicine, Paediatrics, Obstetrics, Orthopaedics, Surgery and Anaesthetics. Although there is little formal ECG training during the internship, interns are required to perform and interpret ECGs in most of these rotations. In the third year after graduation, they are required to work independently as community service medical officers in the public sector, often at sites with limited supervision. Once they have completed this year of community service, they are registered as independent practitioners, are eligible to work in the public or private sector, and may then enrol for specialist training.

Delphi survey development

The investigators carefully selected the ECG diagnoses included on the pre-selected list in the first round by considering the content of undergraduate ECG lectures, suggested and prescribed textbooks for ECG learning [34, 35], and a thorough literature search covering topics of undergraduate ECG teaching [4, 6, 7, 9, 13, 36,37,38,39,40,41,42,43,44,45,46] and postgraduate ECG training [47,48,49,50,51].

Delphi survey administration

The study comprised three rounds of online surveys that were completed by the participants (Fig. 1). The surveys were administered through REDCap (Research Electronic Data Capture), a secure (password-protected) online database manager hosted at the University of Cape Town (UCT) [52]. Participants accessed the online surveys through an emailed link specific to the survey of each round and unique to the participant. If no responses were received for three weeks, reminder emails were sent to all participants who had not yet completed the online survey.

Fig. 1
figure 1

Study flow

The first round of the modified Delphi study

In June 2017, a link to the online survey of the first round was sent to all consenting participants. The survey consisted of directed questions and open-ended questions:

  • directed questions: participants were asked to reply to a set of 5-point Likert-type questions (Supplementary Table 1) using a pre-selected list of topics of instruction (Supplementary Table 2)

  • open-ended questions: participants were given the opportunity to suggest additional ECG diagnoses that were not included in the pre-selected list.

Throughout the first round, the expert panel continued to nominate other colleagues to participate in this modified Delphi study. The last of these invitations was sent in May 2018 and the last response to the survey of the first round was received in June 2018.

Analysis of the first round’s results and preparation for the second round

In June 2018, after three weeks without any new responses from participants, the first round was closed. The investigators subsequently analysed the data collected. The following criteria were used to determine consensus for each ECG diagnosis in the survey:

  • inclusion in the proposed undergraduate ECG curriculum: ≥ 75% of the expert panel indicated that they agreed, or strongly agreed, that a junior doctor should be able to make the ECG diagnosis. These items were removed from the list used in the next round of the modified Delphi study.

  • exclusion from the proposed undergraduate ECG curriculum: ≥ 75% of the expert panel indicated that they disagreed, or strongly disagreed, that a junior doctor should be able to make the ECG diagnosis. These items were removed from the list used in the next round of the modified Delphi study.

The survey in the second round was prepared and consisted of all the items that had not reached consensus, as well as the additional items suggested by the expert panel (Supplementary Table 3).
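
The consensus rule described above can be expressed compactly in code. The sketch below is a simplified illustration using hypothetical responses (not the study's actual analysis script): it classifies an item as included, excluded, or carried forward, depending on whether at least 75% of panellists chose “agree”/“strongly agree” or “disagree”/“strongly disagree”.

```python
# Simplified sketch of the a priori consensus rule, using hypothetical data.
CONSENSUS_THRESHOLD = 0.75  # >= 75% agreement, decided a priori

def classify_item(responses):
    """Classify one survey item from its Likert responses (coded 1-5).

    Returns 'include' if >= 75% of panellists agreed or strongly agreed,
    'exclude' if >= 75% disagreed or strongly disagreed, and
    'no consensus' if the item should be carried forward to the next round.
    """
    n = len(responses)
    proportion_agree = sum(r >= 4 for r in responses) / n     # "agree" / "strongly agree"
    proportion_disagree = sum(r <= 2 for r in responses) / n  # "disagree" / "strongly disagree"
    if proportion_agree >= CONSENSUS_THRESHOLD:
        return "include"
    if proportion_disagree >= CONSENSUS_THRESHOLD:
        return "exclude"
    return "no consensus"

# Example: 8 of 10 panellists (80%) agree or strongly agree -> consensus to include.
print(classify_item([5, 5, 4, 4, 5, 4, 5, 3, 2, 4]))  # include
```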

The second round of the modified Delphi study

In July 2018, a link to the second round’s online survey was sent to all those who participated in the first round of the modified Delphi study. Participants were given collective feedback from the first round. Frequencies of participant responses to each Likert-type question were presented to the participants (Supplementary Table 4), before they completed the Likert-type questions of the second round. After completing all the directed questions (Supplementary Table 1), the expert panel was given the opportunity to comment on the feedback they had seen. The last response for the survey of the second round was received in December 2018.

Analysis of the second round’s results and preparation for the third round

Subsequently, the investigators analysed the data collected from the second round. The same inclusion and exclusion criteria that were used in the first round were applied to the responses to the closed-ended questions. The survey in the third round was prepared and consisted of all the items that did not reach consensus in the second round.

The third round of the modified Delphi study

In May 2019, a link to the online survey of the third round of the modified Delphi study was sent to all those who participated in the first round. Participants were given collective feedback from the second round. Frequencies of participant responses for each Likert-type question were presented to the participants (Supplementary Table 4) before they completed the Likert-type questions of the third round (Supplementary Table 1). The last response to the survey of the third round was received in October 2019.

Analysis of the third round's results

The investigators subsequently analysed the data collected during the third round. From these results, and those of the prior rounds, an undergraduate curriculum could be formulated from the topics of ECG instruction for which consensus was established (i.e. ≥ 75% agreement) amongst the expert panel. Thereafter, the mode was calculated for each item across all rounds, to identify the response most frequently selected by the expert panel. A final list of ECG diagnoses was compiled, including only those ECG diagnoses that had a mode of 5 (i.e. most participants voted “strongly agree”) and that can only be made by means of an ECG recording.
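
As a minimal illustration of this final filtering step, the sketch below keeps only items whose modal response is 5 (“strongly agree”). The item names and responses are hypothetical; the additional restriction to diagnoses that can only be made on an ECG recording was applied by clinical judgement and is not shown.

```python
from statistics import multimode

# Hypothetical Likert responses (coded 1-5) for two consensus items.
item_responses = {
    "Atrial fibrillation": [5, 5, 5, 4, 5, 5],
    "Left axis deviation": [4, 4, 5, 4, 3, 4],
}

# Retain only items whose single modal response is 5 ("strongly agree").
must_know = [
    item for item, responses in item_responses.items()
    if multimode(responses) == [5]
]

print(must_know)  # ['Atrial fibrillation']
```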

Qualitative content analysis

Qualitative content analysis was performed by two investigators (CAV, VCB). An inductive approach was used to identify themes and subthemes from the free-text comments made by expert panellists at the end of the second and third rounds of the modified Delphi study [53, 54]. Themes and subthemes were refined through an iterative process of reviewing the panellists’ responses [55]. Disagreement was resolved through discussion with a third investigator (RSM). A deductive approach was used to quantify the frequency with which the themes and subthemes emerged from the feedback by the expert panel [56].

Results

The modified Delphi expert panel

This modified Delphi study consisted of a large expert panel (n = 131), with good retention in the second (80.9%) and third (77.1%) rounds (Fig. 2). Of the 249 specialists who were invited to take part, five declined the invitation and 111 did not respond. Two participants consented to take part but never completed the surveys.

Fig. 2
figure 2

Recruitment and participation

As shown in Table 2, the composition of the expert panel remained stable between the rounds with regard to speciality, years of experience, the settings in which the panellists encountered ECGs in their own practice, and where they taught ECGs. The panellists had a wide range of expertise (42.8% Internal Medicine, 22.9% Cardiology, 16.0% Family Medicine, 13.7% Emergency Medicine and 4.6% Health Professions Education). A third of the expert panel had more than 15 years’ experience as academic physicians. Most of the panellists consulted in the emergency department (70.2%) and in-patient wards (66.4%), and more than half (55.7%) interpreted an ECG at least once a day. About two thirds were affiliated with a university as a lecturer or senior lecturer. Whereas only 15.3% of the panel were responsible for large-group teaching of ECGs (i.e. lectures), 91.6% were involved in workplace-based teaching (i.e. teaching ECGs on ward rounds, etc.).

Table 2 Composition of the modified Delphi study expert panel

Items that achieved consensus

Of the 53 items on the pre-selected list used in the first round, 46 items (86.8%) reached consensus amongst the panellists for inclusion in an undergraduate curriculum over the three rounds of the modified Delphi study (Supplementary Table 2). At the end of the first round, the expert panel suggested an additional 76 items for inclusion in subsequent rounds of the modified Delphi study, of which 34 (44.7%) reached consensus to be included in the curriculum by the end of the final round (Supplementary Table 3). None of the topics reached consensus to be excluded. The outcomes of the first, second and third rounds are presented in Supplementary Tables 5, 6 and 7 respectively, indicating overall agreement amongst the expert panellists, as well as agreement amongst the different specialties separately.

As shown in Table 3, there was consensus amongst the panellists that a new graduate should know the indications for performing an ECG (i.e. chest pain, dyspnoea, palpitations, syncope, depressed level of consciousness), and that they should be au fait with the technical aspects of performing and reporting a 12-lead ECG.

Table 3 Know the indications for performing an ECG, as well as its technical requirements and reporting

There was consensus that medical graduates should be able to perform basic analysis of the ECG (Table 4) and recognise the normal ECG (Table 5). Most panellists strongly agreed that young doctors should be able to diagnose sinus rhythm, sinus arrhythmia, sinus tachycardia and sinus bradycardia. Regarding atrial rhythms, atrial fibrillation and atrial flutter were considered important by most. None of the junctional rhythms reached consensus. The life-threatening ventricular rhythms, i.e. ventricular tachycardia, torsades de pointes and ventricular fibrillation, all reached consensus. Conduction abnormalities such as left and right bundle branch block, as well as all the atrioventricular (AV) blocks, were considered important. Left and right ventricular hypertrophy reached consensus, as did transmural (STEMI) and subendocardial ischaemia (NSTEMI). As shown in Table 6, most panellists strongly agreed that medical graduates should be able to recognise ECG features such as AV dissociation and pathological Q waves. Consensus was also reached for the recognition of clinical diagnoses such as pericarditis and electrolyte abnormalities (such as hyperkalaemia) on the ECG. Most panellists strongly agreed that medical graduates should have an approach to regular and irregular, narrow and wide complex tachycardias.

Table 4 Basic ECG analysis
Table 5 Recognition of the normal ECG and abnormal rhythms and waveforms
Table 6 Using the ECG to make or support a diagnosis

Feedback from participants

Feedback was received in free-text form from 25 and 28 participants at the end of the second and third rounds’ surveys respectively (Supplementary Table 8). The themes that emerged from the inductive analysis were issues with curriculum development, knowing when to seek advice, contextualised learning, and recognition of the importance of this modified Delphi study (Table 7).

Table 7 The leading themes and subthemes that emerged from the qualitative analysis

Issues with curriculum development

An important subtheme that emerged under curriculum development was the need for prioritisation of the different topics that are taught in electrocardiography. Students should be taught “the firm basics and emergencies” to ensure that, once they graduate, they are able to diagnose conditions that are life-threatening or often encountered in clinical practice. Expert panellists cautioned against an undergraduate ECG curriculum that is too difficult (i.e. one that includes ECGs that are too complex for the level of training of young graduates) and also voiced concern about an undergraduate ECG curriculum that is too extensive and covers too much work.

Knowing when to seek advice

Participants advised that students should be encouraged to seek advice from more experienced colleagues when they have diagnostic uncertainty, and that they should be taught how to make use of electronic support, such as smartphone applications (“apps”), as points of reference in the workplace.

Contextualised learning

It was recommended that ECGs should be taught within a given clinical context. However, with regard to ECG diagnoses, panellists suggested that the focus of an ECG curriculum should be on conditions that can only be diagnosed by an ECG. With regard to workplace teaching, there was a concern that not all the ECG diagnoses recommended by the Delphi study would be encountered in the workplace during student training.

Recognition of the importance of this Delphi study

There was predominantly positive stakeholder engagement. Participants were often appreciative of being invited to be part of the expert panel. Occasional comments criticised the Delphi process with regard to the composition of the panel and the interval between the rounds of the study. Nevertheless, panellists felt that the results of this study should be disseminated, as they would have a positive impact on undergraduate ECG training.

Final list of “must know” ECG diagnoses

In response to concerns about curriculum overload, we compiled a consolidated list of “must know” diagnoses that can only be made by means of an ECG recording (Table 8).

Table 8 The majority of expert panellists strongly agreed that a junior doctor should be able to make the following ECG diagnoses

Discussion

This modified Delphi study was a first attempt to obtain consensus on an ECG curriculum for medical students. The variable training opportunities offered by medical schools and the lack of national and international guidance for an undergraduate ECG curriculum were the rationale for performing this study. Through an iterative process of systematically measuring agreement amongst ECG experts, 80 topics reached consensus for inclusion in undergraduate ECG teaching. These topics included the clinical indications for and technical aspects of performing and reporting an ECG, basic ECG analysis (rate, rhythm, interval measurements, QRS axis), recognition of the normal ECG and of abnormal ECG rhythms and waveforms, and use of the ECG to make or support a clinical diagnosis. From this list of “should know” topics, it was possible to identify 23 “must know” conditions, which are considered imperative ECG knowledge. These 23 conditions should serve as the core of an undergraduate ECG curriculum, because they encompass important life-threatening conditions (such as ischaemia, ventricular arrhythmias, atrial fibrillation and high-degree AV block) that can only be diagnosed by means of an ECG, and for which urgent intervention is likely to make a significant difference to outcome.

The validity of the results of any Delphi study depends on the expertise of the panel [16, 26]. Our study consisted of a large expert panel working in a broad range of clinical practice settings. The Delphi literature has cautioned that large expert panels are difficult to manage, with little benefit in terms of better results [16, 57]. Indeed, we did encounter delays in obtaining responses from the expert panel. However, there was a high response rate and little attrition between rounds. Moreover, the positive stakeholder engagement by participants endorsed the importance of the study. As the surveys were completed online, the study was a cost-effective way of gathering the opinion of experts [15], and it saved participants the time and expense of face-to-face meetings [25]. Furthermore, anonymous participation and feedback limited the influence of panel members on each other [15].

Over and above the list of topics that should be taught, the responses by participants in this study highlight several important issues regarding ECG curriculum development. The long list of topics suggested in addition to the original pre-selected list illustrates the tendency towards curricular overload and the demand for diagnostic expertise beyond the reach of new medical graduates. Overwhelming novices with ECG content that is “too much” and/or “too difficult” paradoxically results in less learning [58]. It is therefore important that course conveners refrain from overloading students.

A theme that emerged strongly from the feedback by the expert panel was the need for prioritisation within a curriculum (Fig. 3). Despite the concerns about curricular overload, 80 topics of ECG instruction achieved expert consensus. These “should know” topics are proposed to guide undergraduate ECG instruction. ECG lecturers and tutors are discouraged from including “nice to know” topics in undergraduate curricula. However, reducing the list of 80 “should know” topics to a list of 23 “must know” conditions allows for a core ECG curriculum that does not overwhelm the student. This condensed list is well aligned with the current recommendation in the literature that ECG teaching should focus on enabling medical graduates to safely diagnose life-threatening conditions, so that emergency management can be promptly implemented [59]. Training that ensures that medical graduates are competent at diagnosing the conditions included in the core ECG curriculum would therefore allow for safe practice. However, in the event of diagnostic uncertainty, graduates should be encouraged to seek assistance from more senior colleagues. Current medical education opinion also increasingly recognises the supportive role of information technology in the process of clinical reasoning and diagnosis [60]. The expert panel’s suggestion that smartphone applications be used to support cognitive diagnostic processes in ECG training is well aligned with this opinion.

Fig. 3
figure 3

ECG training priorities

It has been suggested that tuition should be geared towards the understanding of vectors [61], and the basics of electrocardiography [62,63,64]. If students are familiar with the features of a normal ECG, they may be more able to identify abnormal rhythms and waveforms by means of analysis and pattern recognition [65,66,67].

The need for clinically contextualised ECG training was reaffirmed by this study. This observation is consistent with previous reports that students and clinicians make more accurate ECG diagnoses when the clinical context is known [38, 68]. While this underscores the importance of learning in the workplace, our modified Delphi study identified ECGs that may not be routinely observed in clinical training settings. The participants therefore emphasised that undergraduate ECG training must be comprehensive and not driven only by opportunistic learning encounters [69, 70].

Lessons learnt from this modified Delphi study

Although expert consensus on an undergraduate ECG curriculum could be derived from the quantitative data collection, the modified Delphi process also allowed for the collection of qualitative data, which helped to put the results of this study in perspective. The quantitative results should therefore not be appraised in isolation or seen as the final arbiter, but rather be considered along with the important remarks by the expert panellists as highlighted by the qualitative content analysis. The suggestions from the quantitative and qualitative analyses should also be implemented according to local context.

The large expert panel’s enthusiasm to participate in this study highlighted their acceptance of the Delphi technique as an appropriate means of establishing consensus. The low attrition rate (despite the iterative rounds of the Delphi study) testifies to the panel’s support for an expert consensus document on undergraduate ECG training.

Study limitations

Although this modified Delphi study was conducted in only one country, it represents a broad spectrum of opinion amongst a large group of specialists engaged in undergraduate ECG education and is, therefore, worthy of consideration by the international community. The proposed list of 23 “must know” conditions, consisting of life-threatening and commonly encountered conditions, is applicable to medical school training in any part of the world, including settings where specialist training commences straight after undergraduate studies. The list encompasses conditions that are commonly encountered by clinicians, and is not limited to those who work only in Cardiology or Internal Medicine. For example, a septic and dehydrated patient awaiting bowel surgery is at high risk of developing atrial fibrillation, and a femoral neck fracture might be the result of a syncopal event associated with third-degree AV block.

A limitation of this Delphi study is the absence of Anaesthetists and Paediatricians from the expert panel that devised this undergraduate ECG curriculum. These groups of clinicians, and potentially others, should be involved in future Delphi studies for the development of ECG curricula tailored to their practice.

This modified Delphi study established consensus for a list of conditions that we propose for the tuition of medical students. These topics are apt for pre-clinical and clinical phases of training of medical students. However, this study did not aim to achieve consensus on the teaching modalities that should be used for ECG instruction or assessment of ECG competence.

Conclusion

We have identified undergraduate ECG teaching priorities by means of a modified Delphi study with an expert panel that consisted of specialists with a wide range of expertise. Instead of teaching long lists and complex conditions, we propose focusing on the basics of electrocardiography, life-threatening arrhythmias and waveforms, as well as conditions commonly encountered in daily practice.

Glossary terms

  • ‘ECG analysis’ refers to the detailed examination of the ECG tracing, which requires the measurement of intervals and the evaluation of the rhythm and each waveform [8]

  • ‘ECG competence’ refers to the ability to accurately analyse as well as interpret the ECG [8]

  • ‘ECG interpretation’ refers to the conclusion reached after careful ECG analysis, i.e. making a diagnosis of an arrhythmia, or ischaemia, etc. [61]

  • ‘ECG knowledge’ refers to the understanding of ECG concepts, e.g. knowing that transmural ischaemia or pericarditis can cause ST-segment elevation [71, 72]

  • An ‘Entrustable Professional Activity’ (EPA) is a task of everyday clinical practice that can be delegated to a medical school graduate as soon as they can perform the task competently and unsupervised [73, 74].