Introduction

The pharmacy educational system in South Korea was reformed into a six-year (2 + 4) program in 2009. The major change was the introduction of pharmacy practice experiences to cultivate students’ competencies by enabling them to acquire the knowledge, skills, and attitudes needed to perform a pharmacist’s role in various practice settings [1]. The experiential education program consists of two phases: the introductory pharmacy practice experience (IPPE) courses, 60 h in which students are exposed to simulated pharmacy practice environments within the college of pharmacy, and the advanced pharmacy practice experience (APPE) courses, 1340 h of training in a hospital and community pharmacy track, an industrial/administrative pharmacy track, or a pharmacy research track [2, 3].

The assessment of pharmacy students’ readiness to begin APPE education in clinical pharmacy settings continues to gain attention [4, 5]. The Accreditation Council for Pharmacy Education (ACPE) in the United States (US) has emphasized the importance of competency assessment with comprehensive, formative, and summative testing [6, 7]. Since pharmacy practice training was first implemented eight years ago in South Korea, preceptors and students have raised concerns related to experiential education, such as differences in IPPE educational content and quality among the 37 colleges of pharmacy and differences in students’ competence in translating knowledge into practice [8, 9]. Despite the apparent need for a competency assessment program to gauge students’ readiness for experiential learning, there are no established standardized examinations or evaluation criteria to assess students’ clinical performance consistently and accurately.

The objective structured clinical examination (OSCE) was first introduced as a novel method of assessing the clinical competence of medical students and has since been adapted to numerous other health professional programs, including doctor of pharmacy curricula. The OSCE is advantageous for evaluating competency in difficult-to-assess areas, such as communication, problem solving, and decision-making, with relatively high reliability, validity, and objectivity [10, 11]. In the US and Canada, the OSCE has been implemented in doctor of pharmacy programs and national pharmacist licensure examinations to evaluate whether pharmacy students have the knowledge, skills, and attitudes necessary for clinical practice [12, 13]. Standardized OSCE models have been developed in countries and regions including the US, the United Kingdom (UK), Canada, Japan, Malaysia, and the Middle East to examine progress in assessing students’ readiness for clinical practice [5, 14,15,16,17,18,19,20]. However, there is no generalized assessment program to determine the performance readiness of pharmacy students prior to beginning APPEs. OSCEs have been developed in the US to assess competencies acquired during an IPPE [5, 15]. Nevertheless, they are difficult to apply to the outcome assessment of education delivered solely through an in-class simulation system, since IPPEs in the US also include off-campus training [2, 6, 7]. This study aimed to develop an OSCE covering the core domains acquired through an IPPE and to evaluate, through a pilot study, its appropriateness as a tool for assessing the clinical pharmacist competency of Korean pharmacy students prior to APPEs.

Methods

OSCE development

To establish the OSCE blueprint, we set its core values as human dignity, professionalism, and social responsibility (Fig. 1) [6,7,8]. The OSCE’s core competency domains, to be demonstrated by pharmacy students who completed the IPPE courses, were selected through a review of the literature related to OSCEs for pharmacy students and pharmacist practice examinations in the US, the UK, Canada, and Japan [3,4,5,6,7,8, 12,13,14,15,16,17,18,19,20,21,22,23,24], ideation by the researchers, and group discussions with experts. We primarily referenced the duties for the clinical performance test suggested by Han et al. and the official Korean IPPE textbook [22, 25].

Fig. 1 A flowchart for OSCE development

To develop OSCE cases for each competency domain, we identified case objectives and explored possible case scenarios for each OSCE topic based on the IPPE and pharmacotherapy textbooks used in the 37 colleges of pharmacy in Korea [25, 26] and on ideation by the researchers. Subsequently, we finalized the simulated case scenarios and the criteria for assessing the students’ clinical performance and communication skills within the given time constraint (i.e., 10 min per case) through review by external experts qualified in clinical pharmacy and pharmacy practice education. These experts reviewed the OSCE cases and competency criteria and reached consensus using the Delphi method [27]. Each case scenario consisted of the title, format (interactive or non-interactive), purpose of the OSCE, time, materials, instructions and questions for students, instructions for standardized patients/physicians, and instructions for assessors (i.e., answers and assessment criteria). The instructions for standardized patients/physicians contained a specific script with guidance on how the standardized actors should react to students’ responses. Assessment criteria and a scoring rubric were developed to evaluate the clinical performance skills of pharmacy students, such as critical thinking, patient-centered problem solving, overall attitude and behavior, and provision of correct information, as well as their communication skills, for each OSCE topic [6,7,8, 25, 28].

Setting and subjects

A pilot study was designed as a prospective single-arm observational trial to evaluate the clinical competency of pharmacy students in South Korea prior to APPEs by implementing the OSCE models developed in this study. The inclusion criterion was third-year pharmacy students (i.e., fifth year at South Korean pharmacy colleges) who had completed 60 h of IPPE. They were recruited via flyers posted on the website of the Korean Association of Pharmacy Education and at four colleges of pharmacy located in Daegu and Gyeongsangbuk-do, South Korea. The researchers invited assessors, defined as APPE preceptors or clinical faculty members at pharmacy colleges with at least two years of IPPE or APPE teaching experience, to evaluate the students’ clinical pharmacist competency. On the day of the pilot test, students and assessors each attended their own briefing session, where they were informed about the OSCE procedure, the total expected time for the exam or assessment, the number of simulated cases, the stations with standardized patients/physicians, and the survey to be completed after the OSCE. The competency evaluation criteria and rubric scoring methods were explained to the assessors. All participants provided written consent for voluntary participation before enrolling in the study.

This pilot study was conducted at Keimyung University’s College of Pharmacy in South Korea on September 26, 2020. Each core domain of the OSCE was examined at one of five stations, using a mock pharmacy and standardized patients/physicians with simulated tasks or problems. Prior to the day of the OSCE, four actors serving as standardized patients or physicians received 30 min of training from the primary investigators, along with scripts for the simulation. On the day of the pilot test, students were randomly assigned to four groups of five and rotated through the five stations, one per OSCE case. Each case required 10 min: two minutes of preparation in front of the station, seven minutes of test time, and one minute of travel to the next station. A research assistant at each station managed the time schedule, and two coordinators facilitated the overall OSCE process. A small participation allowance was paid to all participants (i.e., students and assessors), standardized patients/physicians, coordinators, and assistants.

Assessors and scoring

The assessors were provided with competency assessment mark sheets using a binary grading system (i.e., pass or fail) developed for each OSCE case in this study. Four assessors, assigned to each station based on the investigators’ judgement, evaluated the competencies of all students according to the provided criteria. A pass-fail grading system is appropriate for assessing the clinical simulation performance of healthcare students [29, 30]. The assessors were given specific exemplar answers for each evaluation criterion to promote consistency in their assessments. For each evaluation criterion of clinical performance or communication skills, a student received (1) a pass when meeting 50% or more of the criterion and (2) a fail when meeting less than 50% of the criterion. Subsequently, the students and assessors completed surveys with a four-point Likert scale on the difficulty and usefulness of, and their satisfaction with, each OSCE competency area. Students were informed that they could obtain their individual assessment results if desired.
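As a minimal illustration of this grading rule (the study itself used paper mark sheets completed by assessors, not software), the sketch below applies the 50% threshold to each criterion and then computes a station-level performance rate; the criterion names and fractions are hypothetical.

```python
# Minimal sketch of the per-criterion pass/fail rule described above.
# Hypothetical data only: assessors in the study judged each criterion
# directly on a mark sheet rather than from a numeric fraction.

PASS_THRESHOLD = 0.5  # a criterion is passed when >= 50% of it is met

def grade_criterion(fraction_met: float) -> str:
    """Return 'pass' if at least 50% of the assessment criterion was met."""
    return "pass" if fraction_met >= PASS_THRESHOLD else "fail"

def station_performance(criterion_scores: dict) -> float:
    """Percentage of criteria passed by one student at one station."""
    grades = [grade_criterion(v) for v in criterion_scores.values()]
    return 100 * grades.count("pass") / len(grades)

# Hypothetical marks for one student at the patient counseling station
example = {
    "explain expected length of the counseling session": 0.0,
    "confirm the patient's understanding": 0.7,
    "demonstrate correct inhaler technique": 0.6,
}
print(station_performance(example))  # ~66.7 (2 of 3 criteria passed)
```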

Analysis

Data are presented as numbers and percentages for categorical variables and as means and standard deviations (SD) for continuous variables. The difficulty of the OSCEs was evaluated based on the students’ performance rates, measured as the percentage of students who passed each assessment criterion. The OSCE was considered difficult if the performance rate was less than 40% and easy if the rate was greater than 80% [31, 32]. Fisher’s exact and chi-square tests were used to compare the clinical performance assessment results between professors and hospital/community pharmacists, as well as the survey results between students and assessors. Statistical significance was set at a two-sided p-value of < 0.05, and data analysis was conducted using SPSS Statistics (version 22.0; IBM, Armonk, NY, USA).
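The sketch below illustrates these analysis rules in Python with SciPy rather than SPSS: the <40%/>80% difficulty thresholds and a comparison of pass/fail counts between two assessor groups. The 2×2 counts are hypothetical and are not taken from the study data.

```python
# Illustrative reimplementation of the analysis rules described above,
# using SciPy instead of SPSS. All counts are hypothetical.
from scipy.stats import chi2_contingency, fisher_exact

def classify_difficulty(performance_rate: float) -> str:
    """Label a criterion by the percentage of students who passed it."""
    if performance_rate < 40:
        return "difficult"
    if performance_rate > 80:
        return "easy"
    return "intermediate"

# Hypothetical 2x2 table of pass/fail judgements for one criterion,
# split by assessor group (professors vs. hospital/community pharmacists).
table = [[8, 2],   # professors:  8 pass, 2 fail
         [4, 6]]   # pharmacists: 4 pass, 6 fail

# Fisher's exact test suits small expected cell counts; the chi-square
# test is the alternative when counts are larger.
odds_ratio, p_fisher = fisher_exact(table)
chi2, p_chi2, dof, expected = chi2_contingency(table)

print(classify_difficulty(32.1))            # -> 'difficult'
print(round(p_fisher, 3), round(p_chi2, 3))
```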

Results

Development of OSCE’s core competency domains and cases

We ultimately selected five core competency domains for assessing pharmacy students’ readiness for the APPEs: (1) patient counseling; (2) prescription review; (3) provision of drug information; (4) over-the-counter (OTC) counseling; and (5) pharmaceutical care service. The prescription review area was selected instead of the drug preparation and dispensing areas suggested by Han et al. [22], since the researchers determined that competencies related to dispensing should be evaluated after the APPEs [25]. The detailed topics and general assessment criteria for each OSCE core competency domain are listed in Table 1. Of the five domains, four were interactive examinations involving standardized patients/physicians; only the prescription review area was non-interactive, requiring the student to solve problems and present the solutions to the assessors. Summaries of the simulated case scenarios for the five competency areas are presented in Table 2. For example, the objective of the patient counseling case was to counsel and educate a patient with asthma on using a dry-powder inhaler. This station simulated a 25-year-old man who came to the pharmacy with a prescription for a fluticasone inhaler, and the student was asked to provide appropriate patient education and counseling. A four-level rubric was used to score students’ competency in the OSCE: (1) outstanding for achievement of 90% or more, (2) clear pass for 70–89%, (3) borderline pass for 50–69%, and (4) clear failure for less than 50%.
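A minimal sketch, assuming only the thresholds stated above, shows how an overall achievement percentage maps to the four rubric levels; the example percentages are hypothetical.

```python
# Mapping of an overall achievement percentage to the four-level rubric
# described above; example inputs are hypothetical.

def rubric_level(achievement_pct: float) -> str:
    if achievement_pct >= 90:
        return "outstanding"
    if achievement_pct >= 70:
        return "clear pass"
    if achievement_pct >= 50:
        return "borderline pass"
    return "clear failure"

for pct in (95, 75, 55, 30):
    print(pct, rubric_level(pct))
# 95 outstanding, 75 clear pass, 55 borderline pass, 30 clear failure
```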

Table 1 Topics and assessment criteria of the OSCE core competency domains
Table 2 Summaries of the cases in each OSCE core competency domain

Implementation of the OSCE in a pilot test

Twenty students from two pharmacy colleges located in Daegu and Gyeongsangbuk-do were included. As shown in Table 3, there were also 20 assessors from community pharmacies, hospital pharmacies, and colleges of pharmacy, each with at least two years of teaching experience in the IPPEs or APPEs.

Table 3 Demographic characteristics of the competency assessors of pharmacy students (n = 20)

Table 4 shows that the overall performance rate of the students was 50.8%. The average performance rates were 32.1%, 64.8%, 65.4%, 79.7%, and 62.5% in patient counseling, prescription review, provision of drug information, OTC counseling, and pharmaceutical care service, respectively. Among the 18 assessment criteria in patient counseling, the students had a performance rate of less than 40% for 11 criteria, and no student met the following two criteria: “Explain expected length of the patient counseling session” and “Allow patients to summarize and organize relevant education and counseling.” Among the 14 criteria in the prescription review area, the students’ performance was less than 40% for the following three criteria: “Note that the physician’s signature on the prescription is missing”; “Note that the prescription expiration date is missing”; and “Check the patient’s age.” In drug information provision, students’ performance was relatively low for only two of the six assessment criteria: “Reconfirm the question clearly” and “Provide proper information on naproxen co-administration.” The students’ performance was more than 50% for all four criteria in OTC counseling. In pharmaceutical care service, students’ performance was low for only one of the four criteria: “The converted daily dose of oral prednisolone was designed using the appropriate number of tablets and frequency, considering the formulation and dose of those on the market.” The evaluation results of professors and hospital/community pharmacists differed significantly for five of the six criteria in the drug information area, but for only two of the 14 criteria in the prescription review area.

Table 4 Assessment criteria for clinical performance skills in each OSCE area and assessment results in a pilot study

As shown in Table 5, the average performance rate across the seven assessment criteria for the students’ communication skills was 60.4%. However, only three criteria were passed by more than 80% of the students: “Use terms and expressions while considering the partner’s capacity for understanding”; “Adhere to appropriate speech and an attitude that makes the partner (patient or healthcare professional) feel comfortable”; and “Explain or respond with respect to the partner (patient or healthcare professional).”

Table 5 Assessment criteria for communication skills in OSCEs and assessment results in a pilot study

Table 6 shows the results of the students’ and assessors’ surveys after the OSCE was implemented. Both students and assessors felt that the time for each OSCE session was sufficient, that the questions were appropriate for evaluating the students’ clinical performance prior to the APPEs, and that standardized OSCEs were necessary to evaluate the clinical pharmacist competencies students acquire in the IPPEs. In addition, all of them agreed that students’ clinical performance would improve if OSCEs were conducted in the future. However, significantly more students than assessors rated the prescription review and OTC counseling examinations as difficult.

Table 6 Ratings of students and assessors for OSCEs in a survey with a four-point Likert scale

Discussion

As there is a need to improve healthcare services in Korea, pharmacy colleges have tried to develop methods to evaluate students’ practical skills and performance [8, 22]. This pilot study showed that applying the OSCE to Korean students who had completed in-class pharmacy simulation courses was a feasible way to assess their competencies and preparedness for advanced training in community or hospital pharmacies. To the best of our knowledge, this is the first study to apply a standardized OSCE system to pharmacy students who completed the 60-h IPPE courses at Korean colleges of pharmacy in order to evaluate their clinical performance for the APPE curriculum comprehensively and objectively.

This OSCE model can evaluate nine of the 11 pre-APPE core domain competencies of the ACPE: patient safety; basic patient assessment; medication information; identification and assessment of drug-related problems (DRPs); mathematics applied to dose calculation; professional and legal behavior; general communication abilities; counseling patients; and drug information analysis and literature research [6, 7]. Unlike the Korean IPPE curriculum, the US curriculum mainly includes off-campus practice in community and hospital pharmacies along with in-class simulation training [6, 7]. The Pharmaceutical Common Achievement Tests Organization in Japan presented five OSCE competency areas: patient counseling, dispensing, dispensing audit, aseptic dispensing, and provision of drug information [17]. The OSCE core domains in this study were developed after considering the IPPE curriculum and its official textbook, used by most Korean pharmacy schools [3, 25]. Counseling patients on complex dosage forms (i.e., respiratory inhalers or self-injection devices) or OTCs is a key competency that allows clinical pharmacists to provide effective health and medication information to patients and to confirm their understanding of it [6, 7, 25, 33]. By assessing students’ abilities in patient care and prescription review, we could evaluate the basic knowledge, critical thinking, and problem-solving competencies needed to assess patient conditions and DRPs in the community or hospital pharmacy. The competency of time-bound drug information analysis and literature research could be assessed in the drug information area, which required the use of adequate drug information resources and evidence-based pharmacotherapy to provide safe and effective treatment [6, 7, 15, 25, 26].

The OSCE stations with standardized patients or physicians were appropriate, since pharmacy students are recommended to complete specific clinical tasks in an interactive environment [21, 33, 34]. The students’ average performance was lowest, at 32.1%, in the case of counseling the patient on the inhaler and highest, at 79.7%, in OTC counseling. This might indicate students’ insufficient readiness to counsel patients on prescribed inhalers at community or hospital pharmacies. In contrast, the students found the cases related to prescription review and pharmaceutical care service, as well as patient counseling, difficult. In Korean pharmacy schools, the IPPE curriculum is delivered as in-class simulation of prescription review, dispensing, medication therapy management, patient counseling, and drug information provision, while the APPE courses are conducted as field training at community or hospital pharmacies [2, 3]. Since the participating students had not yet started APPE courses, the OSCE cases proved difficult, resulting in performance rates below 80% for certain criteria in all OSCE areas. Malaysian pharmacy students also considered the OSCE areas related to patient counseling, drug dosage review, and drug information service relatively difficult compared to the areas related to drug-related problems or pharmacokinetics [19]. Although pharmacists are required to counsel patients within the expected duration and to verify patients’ medication knowledge according to the pharmacist-conducted patient counseling guidelines and the textbook used in Korean colleges of pharmacy, no student met the relevant assessment criteria [25, 34, 35]. This study also showed that students had weaknesses at the beginning and end of communication in clinical pharmacy practice. In contrast, Japanese students showed excellent outcomes in most communication skill areas, probably because a list of tasks was provided one minute before the advanced OSCE [36]. Standardization of the IPPE curriculum across all colleges of pharmacy in South Korea remains limited. It has been reported that incorporating simulation-based IPPE made pharmacy students more confident in technical and communication skills and more aware of medication errors and other patient safety issues [15]. Therefore, Korean pharmacy colleges should strengthen their simulation-based IPPE curricula so that knowledge can be applied to actual clinical situations across the five key competency areas, and should involve preceptors as reviewers to reduce differences in outcome assessment.

Pharmacy students, faculty, and preceptors need to be introduced to the OSCE system. Implementing OSCEs as part of clinical performance evaluation could help students improve their capabilities by identifying their current level through the assessment of their performance [14, 37]. In a study of third-year pharmacy students in the US, the OSCE was also found to be a means of evaluating students’ clinical capabilities obtained through IPPE practice [2, 5, 7]. Therefore, additional cases for each OSCE core domain should be developed with validated assessment criteria through continuous discussion between pharmacy college faculty members and APPE preceptors. It is also necessary to standardize the OSCE content and lay the foundation for integrating the OSCE into the pharmacy curriculum by referring to the OSCE systems of other healthcare professions. This could be ensured by developing guidelines that cover the timing of the OSCE, eligibility, management of students, execution of the exam (i.e., time, location, duration, etc.), assessment methods, and continuous quality management, as well as the development and confidentiality of the OSCE cases/questions, by referring to this pilot study [21, 34]. However, adopting the OSCE requires a substantial budget for space, administrative overhead, and faculty members’ time [13]. Further studies are needed to find a cost-effective way of introducing the OSCE into the Korean pharmacy education system.

Although several countries, such as the US, Canada, Australia, the UK, and Japan, have used the OSCE in various ways to evaluate the clinical competencies of pharmacy students, most pharmacy schools around the world either have not yet introduced the OSCE into their pharmacy education systems or are still preparing to do so [21, 38]. Therefore, other countries or organizations could refer to the OSCE model developed in this study to develop or improve OSCE systems for assessing students’ readiness for pharmacy practice experiences in community or hospital pharmacies.

As this is the first study to implement an OSCE in Korea, some problems were encountered during the pilot test. First, only one case was developed for each core competency domain and administered to a limited number of students in this pilot project, which could confound the OSCE’s assessment outcomes. Thus, future work should develop a variety of simulated case scenarios and questions and administer the OSCE to a larger number of students. Second, the scoring rubric and questionnaire used in this study lacked proper calibration and validation. Considering that standardized scoring rubrics are essential for improving the consistency of assessment, further studies are needed to calibrate examiners on the use of rating scales before adopting the OSCE for Korean pharmacy students [28]. Finally, most of the pharmacy students had little prior knowledge of the OSCE system, although we explained the overall OSCE procedure to the participants in a briefing session before the pilot study. Therefore, some students might have had difficulty understanding the simulation-based OSCE process and test questions. It has been reported that awareness of the simulated situation made the encounter feel slightly unreal to students, with only 77% of students speaking to simulated patients feeling like a real doctor [39]. Moreover, performance anxiety in some students and unfamiliarity with the examination among both students and assessors probably contributed to the relatively low performance rates in certain domains [40].

Conclusion

The OSCE for patient counseling, prescription review, drug information provision, OTC counseling, and pharmaceutical care service can be used to assess the clinical competencies of pharmacy students who have completed the 60-h IPPE course at a pharmacy college. Our pilot study suggests the need to adjust the difficulty of each OSCE domain and to strengthen simulation-based IPPE education through continuous discussion between pharmacy faculty and preceptors, so that pharmacy students are ready to practice off-campus clinical pharmacy. Future studies are needed to validate the feasibility of this OSCE in larger numbers of pharmacy students.