Introduction

A current trend in medical school curricular design is to reduce the hours students spend in passive lectures in favor of active sessions such as small groups, team-based learning, problem-based learning, case-based learning, flipped-classroom models, think-pair-share (TPS), and buzz groups [1–4]. Such sessions are believed to improve the learning process by increasing engagement with the material.

The idea is that when students are actively involved in the learning process, retention and the ability to apply the learned material will be enhanced. However, developing active learning techniques in a large lecture class setting can be challenging, and course directors are always striving to adopt new strategies to improve knowledge transfer. This paper describes a method of active learning that combines aspects of TPS and audience response devices (clickers) to engage students in an informal, team-based learning session.

Methods

The Quiz Discuss Compare (QDC) session is simple to run in a lecture hall setting. First, students answer a series of multiple-choice questions using clickers, but neither the correct answer nor the percentage of students selecting each response is revealed. Each question is timed, and students are encouraged to treat each question as if it were an item on a high-stakes exam by remaining quiet throughout this first round of questions. After the questions are presented, a handout is distributed containing the multiple-choice questions, open-ended questions, and objectives to help guide student discussion. Students are then encouraged to discuss the material in small, self-selected teams for approximately 25 min. The composition of the teams is not formalized, and students tend to break into groups of two to five for discussion.

Following the discussion period, the questions are presented to the students a second time. This time, the correct answers and the percentage of students who selected each choice are revealed. Immediately afterward, a slide displays the number of correct responses from before and after the discussion. We used the comparative links feature of the TurningPoint software from Turning Technologies (Youngstown, OH) to generate this comparison.

At the end of the semester, we correlated student performance on the summative examinations with attendance at the session and performed a statistical analysis in Microsoft Excel.
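As a concrete illustration of this comparison, the minimal sketch below reproduces the same cohort analysis in Python rather than Excel. The file name, column names, and data layout are hypothetical, and scipy's one-way ANOVA stands in for Excel's single-factor ANOVA.

```python
# Hypothetical sketch of the attendance-vs-score comparison described above.
# Assumes a CSV with one row per student: an 'attended_qdc' flag (0/1) and an
# 'exam_score' column in percent. File and column names are illustrative only.
import pandas as pd
from scipy import stats

scores = pd.read_csv("midterm_scores.csv")  # hypothetical file

attended = scores.loc[scores["attended_qdc"] == 1, "exam_score"]
skipped = scores.loc[scores["attended_qdc"] == 0, "exam_score"]

# Mean difference in percentage points between the two cohorts
mean_diff = attended.mean() - skipped.mean()

# Single-factor ANOVA; with two groups this is equivalent to an unpaired t-test
f_stat, p_value = stats.f_oneway(attended, skipped)

print(f"Attendees: n={len(attended)}, mean={attended.mean():.1f} %")
print(f"Non-attendees: n={len(skipped)}, mean={skipped.mean():.1f} %")
print(f"Difference: {mean_diff:.1f} percentage points, p = {p_value:.3g}")
```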

Results

Figure 1 shows a before-and-after comparison of a typical question from one of our sessions. We note that the design of a question can either stimulate constructive discussion or distract from the session. We aimed to present questions the students found difficult (between 30 and 50 % correct responses on the first attempt) to provide a sufficient base for efficient peer-to-peer knowledge transfer.

Fig. 1 Comparison of the first and second polls of a typical multiple-choice question presented during the Quiz Discuss Compare session. Correct answer is marked with a smiley face

If a question is too difficult, students lack the knowledge base needed for idea transfer, and the result is frustration. That said, a seemingly very difficult question may still provide valuable feedback to course directors about how their students are thinking about a particular aspect of the course. Conversely, if a question is too easy, students may become distracted and view that portion of the session as a waste of time.

We asked two questions regarding the utility of the QDC session in a medical basic sciences course:

  • First, did the QDC session help students perform better on their exams?

  • Second, did the QDC session benefit at-risk students more than higher-performing students?

We demonstrate that students who attend this type of voluntary session perform better on a high-stakes exam. On average, students who attended the pre-midterm QDC session scored 3.2 percentage points higher on our midterm summative exam. Another QDC session was prepared and offered as a voluntary session prior to the final exam; students who attended this pre-final session scored 2.8 percentage points higher on the final summative exam. We note that these two cohorts are not identical, as different groups of students chose to attend the different sessions. The difference in score is important and relevant to both our students and our course for three reasons. First, a single-factor analysis of variance (ANOVA), calculated in Excel, showed that the difference in mean score between these populations is statistically significant, with a low probability (p < 0.001 for both sessions) that the observed difference occurred by chance alone (Table 1). One reason we are able to detect this difference with such confidence is our large class size at St. George’s University, with a typical enrollment exceeding 500 students.

Table 1 Summary of students’ performance on midterm exam (MT) and performance on the final exam (final) separated by attendance at the preceding voluntary Quiz Discuss Compare session

Second, we must put the seemingly small 3.2-percentage-point difference into the context of the class mean (approximately 80 %) and the fact that all students score greater than 50 % on the exam. Because actual scores fall in the upper half of the scale and the majority of students already score above the passing threshold (70 %), an increase of roughly 3 percentage points represents a meaningful shift.

Third, we show a general trend towards a higher letter grade based on attendance at these sessions (Fig. 2). The graph shows that most students who attended benefited compared to students who elected not to attend. Additionally, at-risk students appear to be helped more than higher-scoring students. These results are consistent with previous studies suggesting that team-based learning (TBL) or think-pair-share sessions helped lower performers the most [1, 5]. One reason could be that students receive feedback on their preparedness for an exam and are given the opportunity to develop higher-order reasoning skills [6]. Another could be that lower-performing students are able to discuss difficult concepts with stronger students, an interaction that might not occur outside the classroom.

Fig. 2 Percent of students in each letter grade category based on attendance at the QDC session prior to the midterm exam and the final exam
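For concreteness, a letter-grade breakdown like the one underlying Fig. 2 could be tabulated as in the sketch below. The grade cut-offs shown are illustrative assumptions (only the 70 % passing threshold is stated above), and the data layout follows the hypothetical one used in the Methods sketch.

```python
# Hypothetical tabulation of letter-grade distributions by QDC attendance.
# Grade cut-offs are illustrative; only the 70 % passing threshold is given above.
import pandas as pd

scores = pd.read_csv("midterm_scores.csv")  # hypothetical file

bins = [0, 70, 80, 90, 100]                 # assumed cut-offs
labels = ["below passing", "C", "B", "A"]
scores["grade"] = pd.cut(scores["exam_score"], bins=bins, labels=labels,
                         include_lowest=True)

# Percent of students in each grade category, split by attendance at the session
breakdown = pd.crosstab(scores["attended_qdc"], scores["grade"],
                        normalize="index").mul(100).round(1)
print(breakdown)
```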

We argue that it is the attendance, participation, and discussion that occur at the session that help students achieve a better score on a summative exam. We come to this conclusion because the handout of all questions and materials presented in the session was provided to all students, so attending students were not given an unfair advantage in access to content. We conclude that it is not access to the information that counts; it is what is done with the information.

We asked whether the summative exam score differences we saw after participation in our QDC sessions were created by student self-selection bias. In other words, better students may be more likely to attend a voluntary session, so we would expect better exam outcomes from this cohort regardless of the session itself. To answer this question, we examined the same students’ performance on the Medical Biochemistry midterm and final summative exams from the previous term. When we grouped students by whether they self-selected to attend our sessions, there was no meaningful difference in their previous-term summative exam scores: a difference of 0.7 percentage points on the midterm assessment and 1.1 percentage points on the final exam. Importantly, statistical significance is lost in this comparison (p = 0.38 and p = 0.37; Table 2), suggesting that these small differences do not have correlative value.

Table 2 Summary of students’ performance on the Medical Biochemistry midterm exam (BCHM MT) and the final exam (BCHM Final) separated by attendance at the voluntary QDC session in the Medical Genetics course
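A minimal sketch of this control comparison, under the same hypothetical data layout as above with an added previous-term score column: students are grouped by attendance at the later QDC session, but the scores compared come from the prior Medical Biochemistry exam, so a non-significant result indicates the two cohorts were comparable before the sessions.

```python
# Hypothetical self-selection check: group by attendance at the QDC session,
# but compare scores from the previous term's Medical Biochemistry exam.
import pandas as pd
from scipy import stats

df = pd.read_csv("student_scores.csv")  # hypothetical file spanning both terms

attended = df.loc[df["attended_qdc"] == 1, "bchm_midterm_score"]
skipped = df.loc[df["attended_qdc"] == 0, "bchm_midterm_score"]

f_stat, p_value = stats.f_oneway(attended, skipped)
print(f"Prior-term difference: {attended.mean() - skipped.mean():.1f} "
      f"percentage points, p = {p_value:.2f}")
```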

Discussion

Over the last few decades, the Liaison Committee on Medical Education (LCME) has pushed for medical schools to reduce the amount of passive lecture hours and increase active sessions in the curriculum. These sessions should be structured to facilitate active learning, create a student-centered learning environment, and promote discussion among students. To accomplish this, many innovations have been introduced (or are seeing increased use) in medical school curricula, including small groups, team-based sessions, case-based sessions, problem-based sessions, think-pair-share sessions, and buzz groups. A further development is the flipped classroom model, in which material is provided to students before class and, instead of delivering a formal lecture, the instructor acts as a facilitator for an enhancement session. The QDC session described in this paper may be thought of as an improvement on informal TPS and buzz groups, as the use of clickers provides instant, comparative feedback to the students. Students feel strongly engaged because they receive immediate feedback and can see their improvement within the session itself.

Multiple studies suggest that adult learners benefit from active and engaged learning platforms, demonstrating increased retention and better analytic skills compared to passive formal lecturing [7, 8]. The QDC session we describe shares these advantages. Because QDC promotes discussion among groups of students, it is classified as an active teaching modality. In addition, as a result of the discussion time during the session, teams of students typically come to a consensus, which reinforces the value of team-based decision making [9, 10]. Learning to function as a team player is a crucial skill for students entering the medical field.

We plan to modify our QDC session to fit into the flipped classroom model. To accomplish this, we will provide clinical cases to the students a few days before the session, which will then be used for questions and discussion based on those cases. Another potential innovation is to present a different, but related, set of questions after the discussion period in an attempt to measure how well students can apply the knowledge gained during the discussion.

We received an overwhelming amount of positive feedback from individual students, groups of students, and the Student Government Association. The appreciation from the students who did attend was not lost on us, and it made running the course more enjoyable. Although impossible to quantify, we believe that these sessions help our medical students maintain, and even increase, their excitement for the material presented in our course. Based on the analysis presented in this paper and the positive feedback from our students, we have formally scheduled four QDC sessions into our course module.