Introduction

Evidence-based practice (EBP) is an approach in which clinicians participate in treatment decision-making informed by the best available evidence, complemented by clinician expertise and patient preferences.1 EBP is a fundamental element in delivering contemporary, high-quality care for patients.2 Using evidence to inform a clinical question often simply involves accessing, appraising and using existing guidelines. The clinical decision resulting from this process should be, as described in the Sicily statement, 'made by those receiving care, informed by the tacit and explicit knowledge of those providing care, within the context of available resources.'3 Hence, the appropriate understanding and use of each of these components is key. However, sometimes there is a need to go beyond this: to carry out a systematic search for relevant evidence, critically appraise that evidence and its relevance to the situation, apply it to clinical practice and, finally, evaluate the clinical outcome and the process.4,5

In relation to dentistry, EBP is one of the professional competencies required by the General Dental Council (GDC)6 and is a key learning outcome of the undergraduate curriculum.

However, the literature shows variation in dental practitioners' behaviour and in their application of the available evidence to routine practice. For example, in 2012, a study investigated new dental graduates' (NDGs') understanding and use of National Institute for Health and Care Excellence (NICE) recommendations on antimicrobial prophylaxis for patients at high risk of infection.7 It found that around 30% of NDGs had not read this guideline or its summary, albeit the majority had been made aware of it during their undergraduate training. By contrast, a similar cohort of NDGs in 2018 believed that guideline recommendations play an essential role in their decision-making process.8 A recent systematic review noting this variability in guideline compliance suggested it may be due to diversity in study setting, design and target population.9 An exploration of dentists' behaviours towards delivering evidence-based preventive care in primary dental care concluded that several attributes could influence their relationship with, and use of, EBP. These attributes may be either at person level, such as knowledge (ie awareness or familiarity), attitude and confidence in one's skills, or at context level, related to the environment, time and financial resources.10 Little is known about whether NDGs in the UK have the attributes required to apply EBP to their professional practice and, if so, whether these change in response to their new context: professional practice within a business environment.

Aim

This study was part of a wider investigation into NDGs' transition to practice and aimed to explore changes in NDGs' relationship and engagement with EBP during their transition into professional practice by investigating:

  • Perceived self-efficacy

  • EBP knowledge

  • Attitudes towards EBP and its value for delivering quality care

  • Confidence in critical appraisal skills

  • Frequency of accessing evidence.

Method

Ethical considerations

Ethical approval was obtained through the University of Dundee Schools of Nursing and Health Sciences and Dentistry Ethics Committee, Number 2018009. Participant consent was implied by completion and submission of the questionnaire.

Study design

This was a longitudinal, self-administered, questionnaire-based study. Data were collected electronically at two time-points: upon graduation (R1; May 2018), when NDGs had just passed their final examinations but had not yet started vocational dental training (VDT), and after participants had spent six to nine months as vocational dental practitioners (VDPs) (R2; February to May 2019).

Participants

All NDGs graduating in 2018 within one dental school (n = 66) were invited to participate and respond to the questionnaire.

Survey instrument

The survey consisted of two pre-validated questionnaires11,12 and three clinical scenarios (see online Supplementary Information), organised into four sections comprising 58 items in total.

Section one

Demographics: sex, age category, details of previous degree (if applicable) and NDGs' familiarity with current clinical guidelines.

Section two

Perceived self-efficacy: the evidence-based practice confidence (EPIC) scale.11 Participants rated their self-efficacy to adopt EBP in their practice on an 11-point scale ranging from 0% (not self-efficacious) to 100% (completely self-efficacious).

Section three

Clinical knowledge in relation to a 'gold standard' was investigated through responses to clinical scenarios related to paediatric dentistry (managing carious lesions and recommending recall intervals), taken from, or aligned to, the current guidelines taught within the NDGs' dental school curriculum.13,14

Section four

Knowledge level, self-perceived attitude, confidence in critical appraisal skills and frequency of accessing evidence: these EBP-related domains were assessed through the knowledge, attitude, confidence and accessing EBP resources (KACE) survey tool,12 comprising 35 items distributed across four categories:

  1. Knowledge was assessed in ten questions, each with a single best answer and an 'I don't know' option

  2. Attitudes towards EBP were measured with levels of agreement (five options ranging from strongly disagree to strongly agree) for ten statements covering different areas and attitudes

  3. Confidence in critical appraisal skills was measured using a five-point rating scale ranging from 'not at all confident' to 'very confident'. The domain consisted of six items reflecting different aspects of appraising published research design and reporting quality

  4. Behaviour around accessing evidence was evaluated through participants rating the frequency with which they accessed various evidence sources.

Participant recruitment

NDGs were approached for voluntary participation through their university email addresses (n = 66). The email had three sections: an introduction, an optional request for a personal email address to which the R2 questionnaire could be sent, and the questionnaire itself. Three weeks after the target population were initially contacted, a reminder email was sent. R2 questionnaires were sent to those participants who had submitted their responses and shared a personal email address at R1.

Data management and analysis

The anonymous questionnaire responses were compiled, with decimals rounded to the nearest whole number. The results were reported at two levels: domain level (all the statements related to each domain) and item level (a single statement). For item-level analyses and reporting, descriptive analysis was used to portray the changes in participants' perceptions over time. Binary results, mainly 'correct/incorrect' answers, were displayed in bar graphs as the percentage of correct answers for each survey round, with median values, 25th (Q1) and 75th (Q3) percentiles and interquartile ranges generated using SPSS Statistics and Microsoft Excel 2019. For the findings at domain level, since Likert-scale data are ordinal in nature,15 the non-parametric Mann-Whitney U (MWU) test was employed to compare the R1 and R2 medians for each domain.16 The null hypothesis was that there was no difference between the two survey rounds. The alpha value was set at 0.05.
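As an illustration only (not the study's actual analysis, which was carried out in SPSS Statistics and Microsoft Excel), the short Python sketch below shows how a domain-level comparison of this kind could be run; the scores are invented placeholders and scipy.stats.mannwhitneyu stands in for the MWU test described above.

```python
# Illustrative sketch only: hypothetical domain scores, not the study data.
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical KACE knowledge-domain scores (out of a possible 10) per respondent
r1_scores = np.array([4, 3, 5, 2, 4, 4, 3, 5, 4, 2])  # upon graduation (R1)
r2_scores = np.array([3, 3, 2, 4, 3, 2, 3, 3])        # six to nine months into VDT (R2)

# Report the domain-level summary statistics used in the paper: median and Q1-Q3
for label, scores in (("R1", r1_scores), ("R2", r2_scores)):
    q1, median, q3 = np.percentile(scores, [25, 50, 75])
    print(f"{label}: median = {median}, Q1-Q3 = {q1}-{q3}")

# Two-sided Mann-Whitney U test comparing the two independent survey rounds (alpha = 0.05)
u_stat, p_value = mannwhitneyu(r1_scores, r2_scores, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat}, p = {p_value:.3f}")
```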

Results

In total, 66 NDGs were invited to participate. At R1, 34 (52%) completed the survey; at R2, 21 completed it, representing 62% of the R1 participants who provided personal email addresses and 32% of the invited cohort. The histogram normality plot indicated that the data were normally distributed. Table 1 shows the participants' demographic characteristics.

Table 1 Participants' characteristics for both questionnaire survey rounds. R1 (n = 34) and R2 (n = 21). Based on valid responses and rounded to nearest %

For all questions related to familiarity with current clinical guidelines and related concepts, the majority of participants considered themselves familiar: 76% (n = 26) overall at R1 and 90% (n = 19) at R2 (Table 2).

Table 2 Participants' perceptions of their familiarity with current clinical guidelines and other principles around searching for evidence. R1 (n = 34) and R2 (n = 21). Based on valid responses and rounded to nearest %

Overview of the assessed domains

A trend towards reduced median scores was noted across all domains between R1 and R2 (Table 3). The MWU test showed that only participants' attitudes towards EBP, their confidence in critical appraisal skills and their self-reported access to reliable evidence resources declined to a statistically significant degree over time (p = 0.01, 0.05 and 0.02, respectively). The differences for the 'self-efficacy' (p = 0.8) and 'knowledge' (p = 0.07) domains were not statistically significant.

Table 3 Median, Q1-Q3 values and statistical significance of changes over time of the overarching domains of the EPIC and KACE scales. R1 (n = 34) and R2 (n = 21). Based on valid responses and rounded to nearest whole number

Self-efficacy (EPIC scale)

At domain level, participants' scores for their perceived self-efficacy to practise in line with the latest available evidence decreased over time; however, the difference in medians between the two survey time-points was not statistically significant. At item level, participants reported lower self-efficacy at R2 in seven items and higher in three. The median score for one item, related to formulating a PICO (patient, intervention, comparison and outcomes) question, did not change over time. Detailed results can be found in online Supplementary Table 1.

Knowledge of current guidelines (gold standards)

When knowledge was tested, the median percentage of the correct responses from all scenarios was around the midpoint (47%) for both rounds. There was no evidence of improvement or reduction in participants' knowledge level around guideline recommendations over time.

Knowledge of evidence-based practice principles (KACE scale)

At domain level, there was no evidence of statistically significant differences between the R1 and R2 responses. Participants' knowledge of EBP concepts was below the midpoint of the domain scale (median score out of a possible ten [Q1-Q3]: R1 = 4 [2.5-4]; R2 = 3 [2.8-3]; p = 0.07).

At item level, as Figure 1 shows, all ten questions showed a reduction in the percentage of correct answers across rounds. The difference between the two survey rounds was most evident in participants' knowledge of levels of evidence: 65% (n = 22) of R1 participants recognised a 'Cochrane review' as the highest level of evidence among the given options, compared with only 29% (n = 6) of respondents at R2 (Fig. 1).

Fig. 1 Percentage of the respondents who correctly answered the knowledge domain questions

Attitude (KACE scale)

At domain level, there was a statistically significant reduction in the NDGs' attitudes towards EBP (p = 0.01). This reduction was also seen at item level: the median scores at R1 were 'agree' for nine of the ten statements but moved to 'uncertain' for eight of these at R2 (online Supplementary Table 2).

Confidence in critical appraisal skills (KACE scale)

For individual items, the median scores of three items were higher at R1, with no change between rounds for the other three items. Assessing the generalisability of study findings was the area where most respondents at R1 rated their confidence level as low (not confident); participants at R2 were even less confident in this skill (median score 'not at all confident'). The highest median score (indicating a higher confidence level) was associated with perceived confidence in assessing the value of a research report: participants believed they were 'fairly confident' at R1 and 'confident' at R2 in performing this EBP-related skill (online Supplementary Table 3).

Accessing evidence (KACE scale)

At individual item level, colleagues/VDT trainers were considered the main source of evidence by the majority of participants at R1. This did not change at R2, the median score being 4 out of a possible 5 for both rounds. Colleagues/VDT trainers as evidence sources were followed by the internet: many participants reported that they 'often' used this source of evidence at R1 and 'occasionally' at R2 (median R1 = 4, R2 = 3) (online Supplementary Table 4). Most participants responded that they 'rarely' consulted 'research papers published in peer-reviewed journals' to look for evidence at R1 and 'never' at R2. Evidence-based dentistry journals were also reported to be checked 'rarely' by participants at both rounds.

Discussion

This study assessed EBP-related attributes of a cohort of NDGs at two time-points to gain a better understanding of how their relationship with this dimension of their clinical practice may change. These attributes included self-efficacy, knowledge, attitude, confidence and accessing evidence. The time-points were chosen because, at R1, the target population were considered to have met the undergraduate programme standards, reflecting the GDC's competency levels for 'safe beginners', but had not started their VDT.6 At R2, NDGs had been exposed to professional practice through VDT, with its potential limitations and opportunities. No sample size calculation was conducted, as the entire population from one undergraduate programme was targeted. In addition, approaching other dental schools would have introduced other variables which could make the findings less clear.

The questionnaire results of both rounds suggest that all the measured domains decreased, to varying degrees, at R2 compared to R1. The findings indicate that time after graduation and being in professional practice appear to have had an adverse influence on the new graduates' relationship with EBP. This may be in line with the systematic review by Choudhry and colleagues, who concluded that practitioners' years in practice were inversely associated with their knowledge of, and adherence to, available guidelines.17 This negative change over time may be because engagement with EBP principles during undergraduate training is only theoretical and lacks context, rather than fostering a deep appreciation of their role or relevance to professional practice. It could be that EBP is engaged with only to the extent that students satisfy the lower levels of Bloom's taxonomy, such as knowledge and comprehension, so that they can pass exams. However, at the point where this more superficial learning should be applied and reinforced through repetition, that is, brought meaningfully into their practice, the environment they are in either fails to support the actions required to do this or conveys that they are unnecessary. This means the NDGs' use of EBP declines further as time goes by. This suggestion is supported by the finding that 'seeking colleagues' opinions' and 'casual internet browsing' were the common ways in which they looked for evidence. While the experience of colleagues can be a valuable resource, their opinions may be influenced by personal bias or may not be up to date, especially if those colleagues have had the same experiences of EBP as the new dentists and have themselves relied on other practitioners. The decline in the participants' attributes may be at odds with the goal of the VDP training scheme to ensure dental graduates 'have developed into competent, caring, reflective practitioners'.18

The questionnaire's clinical scenarios explored dental graduates' knowledge of the current, local guidelines used to inform their clinical judgement. There was a higher proportion of correct responses (based on current guidelines) compared to the knowledge domain of the KACE scale. It is interesting to note that, even for some correct answers, the rationale given to explain them was not necessarily accurate. It could be that dental graduates have a relatively better understanding of the 'gold standards' related to their clinical practice, since these are 'ready-to-use' sources of evidence that can be applied without the need for critical appraisal skills. Participants' mixed views on their confidence in evaluating evidence and their low self-efficacy in interpreting study results using various statistical procedures reinforce these findings.

Knowledge of EBP concepts upon graduation did not score highly (collective median score for R1: 4 out of a possible 10). 'Explain, evaluate and apply the principles of an evidence-based approach to learning, clinical and professional practice and decision making' is one of the GDC's learning outcomes required of dentists upon registration.6 Clinical practice changes with evidence creation and circumstance (COVID-19 is an extreme example). Competence in EBP is not limited to the use of readily available recommendations within clinical guidelines; it includes the acquisition, assessment and implementation of scientific evidence in practice to keep the profession current and practising in line with new developments. This requires knowledge and use of EBP.

Interestingly, even though participants had limited EBP knowledge, they appear to value EBP and acknowledge its role in improving standards of care, albeit they are conscious that practising according to the latest evidence is not always possible. However, this positive stance towards EBP was not translated into actual practice, as the vast majority of the NDGs would consult a colleague or casually search the internet rather than seek a more reliable source of information. This pattern of EBP practice was also noticed by Iqbal and Glenny almost two decades ago, when they identified time, funding and difficulty accessing evidence as the main barriers.19

The findings of this study highlight other possible issues related to whether the NDGs were appropriately trained, and subsequently supported, to be lifelong learners. Another area to consider is the complexity of searching for 'appropriate' evidence in the midst of an ever-increasing number of research articles. Hence, adopting EBP can be more challenging than it seems. More research is needed to assess whether achieving the competency related to EBP at university carries forward into a career-long approach to consulting evidence and applying it appropriately to practice.

This study had some limitations. Two questionnaire instruments (the KACE and EPIC scales) were employed to collect data from the participants. These instruments were selected because their psychometric properties had previously been tested in a similar context. However, combining the two instruments made the questionnaire long, which may have introduced response bias. Another source of response bias might be that only those who valued EBP chose to participate. The cohort of this study was drawn from one dental school; hence, the conclusions drawn are context-related, albeit there is no reason to infer that the participants of this study differ from NDGs of other dental schools in the UK. All curricula are based on the same GDC learning outcomes but vary in teaching style, which may result in variable levels of knowledge, confidence and attitudes among graduates. Investigating NDGs from different dental schools is therefore recommended to attain more generalisable conclusions. Finally, the data were collected anonymously and analysed at cohort level rather than on an individual participant basis. This meant that participants could not be tracked, and characterising the individuals who dropped out and did not complete the questionnaire at R2 was not possible.

Conclusion

After six months in professional practice, NDGs showed statistically significant reductions in the value they placed on EBP, their confidence in EBP-related skills and their use of reliable evidence sources. There were no statistically significant differences in their 'self-efficacy', knowledge of EBP principles or awareness of taught gold standards. NDGs also found fellow dentists and casual internet browsing to be acceptable alternatives to formal evidence searching and evaluation. This could hold the profession back by slowing the adoption of modern, evidence-based approaches.

Concerningly, NDGs' demonstrable knowledge of EBP principles was questionable even upon graduation, despite their competency having been verified by the dental school through various assessments throughout the undergraduate programme. Further research should explore possible reasons for these findings and suggest solutions for improving the use of EBP.