Abstract
Objective
Pharmacy calculations is a challenging course often associated with student anxiety about assessments and grades. This study was conducted to determine whether self-paced, multiple-attempt assessments would reduce student anxiety in pharmacy calculations.
Materials and methods
Self-paced, multiple-attempt assessments were presented to students as graded practice modules and as examinations. Pre-post surveys were used to measure student anxiety in pharmacy calculations. Module performance indices and exam performance indices were correlated with course grade outcomes.
Results
Fifty-four students participated in pre-surveys and forty-eight students participated in post-surveys. Westside Test Anxiety Scale survey results indicated that use of self-paced, multiple-attempt assessments reduced students’ perceived anxiety about pharmacy calculations. Student comments about the assessment methods were predominantly positive. Course grades strongly correlated with module mean scores and high scores, and strongly correlated with mean exam scores. A negative correlation between course grades and belated module attempts suggested a harmful association between student procrastination and course performance among weaker students.
Conclusions
Self-paced, multiple-attempt assessment was associated with a reduction in students’ perceived test anxiety about pharmacy calculations. With care taken to limit student procrastination, use of these types of assessment could be an effective means of improving student comfort while promoting mastery of the subject.
1 Introduction
Student anxiety about performance and attainment of desired letter grades is a problematic issue in higher education. Pascoe and colleagues [1] reviewed Organization for Economic Cooperation and Development survey data that revealed two-thirds of students experienced stress regarding grade outcomes [2]. Greater than half of the students surveyed indicated that they experience test anxiety, with over one-third of students describing feelings of anxiety even while studying. Shankar and Park reviewed the impact of stress on students and cited various effects of stress on their endocrine and immune functions, behavior, cognition and psychology [3]. Student depression and panic attacks have been identified as examples of stress-induced challenges to students’ mental health [4]. Elevated levels of test anxiety are also linked to procrastination and loss of motivation to engage in academic work [5]. Pharmacy students have reported negative effects of stress on mental health and quality of life [6], as stress is associated with lower grade point average in these students [7].
1.1 Context
Pharmacy calculations is an important course of study in many pharmacy curricula. Student mastery of pharmacy calculations is integral to successful passage of the North American Pharmacist Licensure Examination (NAPLEX). Pharmacy calculations is distinct in its emphasis on application of knowledge rather than memorization of knowledge. Pharmacy calculations assessments often require time-intensive, manually inputted answer entries (at times including work shown) rather than multiple choice recognition of answers.
The departmental grading scale works well when applied to pharmacy courses that feature large numbers of assessment questions within limited time intervals. For instance, students can often comfortably complete multiple-choice exams of fifty or more questions within an average hour-long exam period. However, pharmacy calculations questions require considerably greater time (3 to 5 min) per question; therefore, fewer questions are posed within standard assessment intervals. In addition, pharmacy calculations questions are generally graded in an all-or-nothing fashion. This feature, along with the limited number of questions per assessment, contributes to anxiety among students. Anecdotal reports of anxiety increase as the course progresses. Indices of anxiety such as task and practice avoidance are also observed in students [8]. In some cases, students’ confidence erodes quickly, especially from mid-terms to semester’s end. Because math anxiety can impact achievement in pharmacy calculations [9, 10], the observed anxiety is reminiscent of reports that math anxiety has significant and negative associations with performance outcomes [11,12,13].
Given the constraints associated with question number, time, and grading scales, the incorporation of assessment rigor is a challenging proposition for pharmacy calculations instructors. In the absence of rigor, conditions are less than optimal for students to reach full development, be it academic or personal in nature [14]. Faculty must balance the desire to challenge students with an awareness of the pressure students face to perform very well within narrow grading constructs. How can students be motivated to shift their perspectives about how and why they learn pharmacy calculations in a high-stakes, high-pressure professional school environment? Considering grading scale constraints, how can fears regarding the course be diminished without resorting to grade inflation? Multiple-attempt testing strategies are a form of retrieval practice, which has been observed to promote long term retention of information [15] and to promote educational outcome attainment in weaker-prepared students [16, 17]. Retrieval practice is also effective in promoting stress-resistant memory, which could inhibit test anxiety [18]. Accordingly, retrieval practice has been observed to reduce test anxiety in students [19,20,21], including use of applications such as ungraded quizzes [22].
Use of multiple-attempt testing has proven useful in allied health disciplines such as medicine [23,24,25] nursing [26,27,28] and pharmacy [29]. The strategy is often associated with self-paced assessment, which has been featured in pharmacy calculations [30,31,32] and math courses [33, 34]. The objective of this investigation was to determine whether self-paced, multiple-attempt assessments would have a positive impact on students’ perception of anxiety in pharmacy calculations. The association of assessment activities with final grade outcomes were also evaluated. It was hypothesized that use of these assessment strategies would lower student perceptions of anxiety about pharmacy calculations.
2 Materials and methods
An overview of the study’s design is presented in Fig. 1.
The investigation took place in the United States. The study featured quasi-experimental research methodology and pre-post surveys and was exempted by the Texas Southern University Institutional Review Board (Federal Wide Assurance number: FWA00003570). The study protocol was exempted in accordance with United States federal regulations 45 CFR 46.104(d)(1) and 45 CFR 46.104(d)(2), pertaining to impacts on student learning and interactions limited to educational testing, respectively. Informed consent was obtained from all participants in the study.
The study featured correlational research methodology through association of module assessment attempt data with the outcomes of final exam score and course grade. Sixty professional pharmacy program students were enrolled in pharmacy calculations. All students enrolled were first-year pharmacy students. All data collected during the study were sourced from this group of students. The study population was 67% female and 33% male and was approximately 63% Black, 25% Asian, 10% Hispanic and 2% Caucasian. The pharmacy calculations course was three credit hours and lasted one semester. The course consisted of the following topic areas that are encountered to varying degrees in other courses in the pharmacy curriculum:
- Prescription interpretation, metric system and unit conversions
- Dose calculations
- Reducing and enlarging formulas
- Specific gravity
- Expressions of strength
- Dilution and concentration
- Infusions and IV admixtures
- Chemical equivalency
2.1 Course structure and grading scheme
The pharmacy calculations course consisted of in-person lectures for each topic, each reinforced by in-person problem-solving sessions. Lectures met for fifty minutes, three days per week, over sixteen weeks. Problem-solving sessions took place over three hours, one day per week, over fourteen weeks. Formal assessments included seven examinations (eight questions each) and a final examination (thirty questions). All assessments were conducted online using Blackboard Learning Management System (LMS)® (Blackboard Learn 9.1 version 3900.28). The course also included practice assessments in the form of online modules (ten questions each) on the Blackboard platform. A larger number of questions was selected for the modules because they were untimed and fully self-paced.
Due to the increased emphasis on self-pacing and multiple-attempt assessment, the grade weighting used in this study differed from previous course offerings. Previous offerings of the course consisted of 8% graded practice, 67% quizzes and exams, and a final exam accounting for 25%. For this study, the weighting was shifted to 31% self-paced graded practice, 13% self-paced exams, and a final exam accounting for 56% of the final grade.
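As a concrete illustration of the revised weighting, a course grade is a weighted average of the three component percentages. The sketch below uses the study's weights (31% modules, 13% self-paced exams, 56% final exam) with hypothetical component scores; the helper function itself is not part of the study.

```python
# Illustrative weighted-grade computation under the study's weighting.
# The weights are from the text; the component scores are hypothetical.

WEIGHTS = {"modules": 0.31, "exams": 0.13, "final": 0.56}

def course_grade(scores):
    """Weighted average of component percentages (each on a 0-100 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 100%
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Example: strong module work cannot fully offset a weak final exam,
# since the final carries more than half the grade.
example = course_grade({"modules": 95.0, "exams": 88.0, "final": 70.0})
```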
2.2 Assessments
Students were informed about self-pacing and multiple-attempt opportunities in the course syllabus. Students were also informed about these details during verbal reviews of course policies and procedures. Pharmacy calculations practice modules largely consisted of retired exam questions drawn randomly from question banks pertaining to each topic area in the course. The modules were accessed using Blackboard LMS and required entry of numerical answers. The assessments were randomized both in terms of question order and numerical answers. Modules could be attempted in unlimited fashion until maximum scores were attained for each topic. Students were provided answers to each module question along with automated feedback about each question upon submission of the module attempt. Feedback was topic-specific and provided detailed instructions to obtain problem solutions. Use of feedback has been demonstrated as a useful tool for self-assessment and concept reinforcement in the health sciences [32, 35]. Students were allowed to use text, lecture and online resources to assist them as they attempted modules. Each successive module could be accessed when students achieved a 69.5% score threshold that was programmed into the Blackboard Learn 9.1 system. The progression threshold was set as such because 69.5% is the traditional pass/fail line in the department. The modules were untimed, and attempts could be paused and resumed at students’ leisure. All modules could be accessed twenty-four hours a day from a few days prior to the first lecture in the course until four days following the comprehensive final exam assessment.
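The progression rule described above can be sketched as a simple predicate: the next module becomes available once the best score on the current module reaches the 69.5% threshold. The threshold is the course's; the function name and representation of attempts are illustrative assumptions.

```python
# Hypothetical sketch of the module progression rule programmed into
# Blackboard: the next module unlocks once any attempt on the current
# module meets or exceeds the 69.5% departmental pass threshold.

PASS_THRESHOLD = 69.5

def next_module_unlocked(attempt_scores):
    """True if any recorded attempt met or exceeded the pass threshold."""
    # max(..., default=0.0) handles a module with no attempts yet
    return max(attempt_scores, default=0.0) >= PASS_THRESHOLD
```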
With the exception of the final exam, unlimited attempts were allowed on formal exams for concept reinforcement and to provide an opportunity to achieve maximum assessment scores. Use of repeated examination has been observed to benefit lower-performing students in pharmacy calculations [36], and increased exam number has been observed to reduce feelings of anxiety in pharmacy students. In contrast to the module assessments, exams were timed (sixty minutes) to better simulate a conventional exam environment. Formal exams were also distinct from modules in that they possessed consistent question stems on each attempt. Blackboard LMS enabled use of algorithms to randomize question order and numerical answer values. Successive exams could be accessed upon achievement of a score threshold (69.5%) on each prior assessment in sequence. All exams were available for use until four days following the final exam assessment. The comprehensive final exam was given at the end of the course and was conducted using traditional in-person proctoring over a two-hour time span. In contrast to all other assessments, the final exam was single-attempt.
Assessment data were collected using Blackboard Learn LMS. Module and exam indices included number of attempts, number of attempts within fourteen days of the submission deadline, average high score per module topic, average low score per module topic, and average score per module topic. Assessment indices were correlated with corresponding course grades.
2.3 Anxiety surveys
Pre-post surveys were distributed to assess student test anxiety in pharmacy calculations during the first week of the course and again during the week prior to the final exam. The surveys were disseminated to students with the instruction that the statements referred specifically to their experiences in pharmacy calculations. The Likert surveys consisted of the Westside Test Anxiety Scale, a validated tool used to measure test anxiety [37]. The Westside Test Anxiety Scale Likert interpretations are as follows:
- 1.0—1.9 Comfortably low test anxiety
- 2.0—2.5 Normal or average test anxiety
- 2.5—2.9 High normal test anxiety
- 3.0—3.4 Moderately high (some items rated 4 = high)
- 3.5—3.9 High test anxiety (half or more of the items rated 4 = high)
- 4.0—5.0 Extremely high anxiety (items rated 4 = high and 5 = extreme)
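As an illustration, the composite Westside score is the mean of the ten Likert item responses, which then maps to the interpretation bands above. The sketch below is hypothetical; in particular, how the overlapping 2.5 boundary between the "normal" and "high normal" bands is assigned is an assumption, and the band labels are abbreviated.

```python
# Hypothetical scoring sketch for the Westside Test Anxiety Scale:
# the composite score is the mean of ten 1-5 Likert responses.
# Band boundaries follow the published interpretation list; assignment
# of the shared 2.5 boundary to "high normal" is an assumption here.

def westside_score(responses):
    """Mean of ten Likert responses, each an integer or float from 1 to 5."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    return sum(responses) / len(responses)

def interpret(score):
    """Map a composite score to an abbreviated Westside interpretation band."""
    if score < 2.0:
        return "Comfortably low test anxiety"
    if score < 2.5:
        return "Normal or average test anxiety"
    if score < 3.0:
        return "High normal test anxiety"
    if score < 3.5:
        return "Moderately high test anxiety"
    if score < 4.0:
        return "High test anxiety"
    return "Extremely high anxiety"
```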
Students were informed that survey participation was voluntary, and that survey participation would have no effect on course grades. Students were also informed that all survey responses and commentary were de-identified by Blackboard LMS, ensuring that response sources were unknown to the instructor. A link to access the informed consent page and survey was communicated on the Blackboard course announcement page. Both surveys were available for access during five-day intervals. The survey was deactivated and removed from Blackboard at semester’s end.
Student commentary about the use of self-paced, multiple-attempt assessments was encouraged and collected alongside survey feedback. The comments were evaluated to codify recurrent and/or notable statements. Comment themes were then developed that indicated consensus about self-pacing and multiple-attempt strategies. The most salient comments were preserved to represent the consensus.
2.4 Assessment data collection and analysis
The Respondus Lockdown Browser® (Version 2.0.7.06, Respondus, Inc.) was used to secure and distribute all Blackboard assessment and survey items. Microsoft Excel® (Excel 2019 for Windows, Microsoft Corporation) software and XLSTAT® (Version 2021.3.1, Addinsoft Inc.) software were used to perform statistical analyses. Summary statistics were calculated, and a multiple regression model was used to evaluate assessment indices and course grades. Data normality was determined using the Shapiro-Wilk test, and Pearson correlation coefficients were calculated among data indices. Median and modes for pre-post surveys were also calculated. In addition, Mann–Whitney U tests were used to determine differences in pre-post survey item responses. Due to de-identification of survey data, two-tailed unpaired t tests were used to determine differences between pre-post survey Likert scale composite responses. A threshold of α = 0.05 was selected for tests of significance.
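The analysis steps above can be sketched with SciPy on simulated data: a Shapiro–Wilk normality check, a Pearson correlation between an assessment index and course grades, and a Mann–Whitney U test comparing pre/post responses to a survey item. All variable names and simulated values below are illustrative assumptions, not the study's data.

```python
# Sketch of the described analysis pipeline on hypothetical data
# (alpha = 0.05, as in the study). Real analyses used XLSTAT/Excel;
# SciPy is substituted here for a self-contained example.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ALPHA = 0.05

# Hypothetical module mean scores and course grades for 54 students
module_mean = rng.normal(85, 5, 54)
grades = module_mean + rng.normal(0, 3, 54)

sw_stat, sw_p = stats.shapiro(grades)            # normality check
r, r_p = stats.pearsonr(module_mean, grades)     # correlation with grades

# Hypothetical 5-point Likert responses to one item, pre (n=54) and post (n=48)
pre = rng.integers(1, 6, 54)
post = rng.integers(1, 6, 48)
u_stat, u_p = stats.mannwhitneyu(pre, post, alternative="two-sided")

print(f"Shapiro-Wilk p={sw_p:.3f}, Pearson r={r:.2f} (p={r_p:.3g}), "
      f"Mann-Whitney U p={u_p:.3f}")
```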
3 Results
Westside Test Anxiety Scale survey response statistics for the pre- and post-survey disseminations are presented in Table 1. Fifty-four students participated in the pre-survey, whereas forty-eight students participated in the post-survey. Anxiety responses indicated reductions in feelings of extreme anxiety for all 10 statements. Combined percent responses pertaining to high-to-extreme anxiety (i.e., selections 4 and 5) decreased for all statements between surveys. In contrast, the combined percent responses referencing low anxiety (i.e., selections 1 and 2) increased for all statements except Statement 10, which pertained to anxiety associated with writing assignments.
The medians and mode of responses to pre-post survey statements were compared (Table 1). Selection medians decreased between surveys for Statements 3, 5, 7 and 9, whereas selection modes decreased for Statements 1, 3, 4, 5, 6 and 9. Only one increase in an anxiety measure—the mode for Statement 10—occurred during the post-survey.
Statistically significant (p ≤ 0.05) changes in Likert question responses for the following statements were observed in the post-survey.
- Statement 3: During important exams, I think that I am doing awful or that I may fail.
- Statement 9: After an exam, I worry about whether I did well enough.
In addition, a statistically significant difference (p < 0.0001) between pre-post survey composite Likert scale score was observed.
Shapiro–Wilk test results revealed normal distributions for module and assessment data. Significant correlations were noted between course grade and several exam indices (Table 2), including a weak negative association with number of attempts made during the final two weeks of module availability. Weak and moderate positive associations were observed between course grade and exam low score and exam high score, respectively. A strong association was observed between course grade and mean exam score. In reference to practice modules, course grade was moderately and significantly associated with module low score, while module mean score and module high score were strongly and significantly associated with course grade (Table 2).
The distribution of grade results and their associations with exam and module attempts are displayed in Table 3. The data depicts an association of fewest attempts with both the very highest and very lowest grades in the course.
Representative comments about the use of self-pacing and repetition of practice and assessments are included in Table 4. Thematic analysis yielded four themes: control, comprehension, anxiety and deadlines. Students indicated favorable opinions for use of study interventions due to the advantages of increased outcome control through multiple attempts, improved comprehension of concepts through self-pacing, and reduction of anxiety through self-pacing. Several students communicated a desire for assessment submission deadlines at regular intervals.
4 Discussion
Westside Test Anxiety Scale survey results indicated consistent reduction in student test anxiety following exposure to self-pacing and assessment repetition. Perhaps most notable was the substantial decrease in the percentages of students who selected the “extremely high anxiety” response for most statements. Statements pertaining to negative feelings during exams (Statement 3) and post-exam concerns about performance (Statement 9) were significantly different between surveys and share the common feature of assessing general test anxiety, albeit at distinct moments of the test encounter process. Other survey statements are specific by comparison. Statement 10—the lone statement in which “low anxiety” percentages decreased on the post-survey—referenced writing assignments. This statement was least applicable to course procedures. Pharmacy calculations assessments feature short-answer numerical responses, so there were minimal writing tasks during the semester.
Use of multiple attempts on Blackboard Learn 9.1 LMS created more opportunities for practice and for assessment. Online modules presented an efficient and accessible way for students to reinforce concepts. In addition, attachment of a graded component to these online activities was a positive motivator for students. The inclusion of these activities reduced perceptions of student anxiety about calculations testing. Repeated low-stakes testing has been reported to assist students in acclimating to tests and to reduce test anxiety [21, 27, 38, 39].
Greater control of destiny was a common theme identified in student comments (Table 4). Control of factors such as time, environment and grade determination is deemed an advantage of self-pacing and multiple-attempt assessments in multiple contexts [40,41,42,43]. Several students commented that self-pacing assisted in their comprehension of concepts. Some students were happy about the ability to learn the subject at a manageable pace, which might have proven difficult under a traditional assessment structure. Students also alluded to strengthening concept mastery through repetition. This is consistent with reports of repetitive assessment promoting mastery of concepts [32, 44]. Some students explicitly referenced feelings of reduced anxiety because of the assessment interventions. The comments were notable because students had not been prompted to discuss the interventions in this context. Previously, Tchen and colleagues [45] reported self-pacing to have neutral effects on feelings of anxiety in advanced pharmacy practice experience students. Despite the cited positives of self-pacing, a common critique offered by students was the desire for structured, regular submission deadlines for assessments rather than a single deadline at the conclusion of the course. Strategies such as implementation of formal quizzes [46] and incorporation of time intervals and progress indicators [47] have been reported to improve student engagement with self-pacing.

Moderate and significant correlations observed between course grade and exam indices suggest an association between self-paced exam activity and grade performance. Whereas total exam attempts had very little association with course grades, “last-minute” exam attempts exhibited a significant negative association with grades.
Last-minute bursts of exam attempt activity were especially prevalent in students who performed poorly in the course, which might indicate a lack of preparation in students who procrastinated during much of the semester. In contrast, students who opted to complete their exams in pace with lecture topics achieved more successful course grades. Procrastination has been cited as a major drawback to self-paced, limitless scoring opportunities [48].
Strong correlations among module indices (high score and mean score) and course grades suggest that module performance associated with course performance. This result is reminiscent of observations that student homework performance better correlates with course grade outcome than formal exam performance [49], and that unsupervised online assessment enhances exam performance [50]. Module exercises included automatic feedback regarding both correct and incorrect answer submissions. Use of item feedback has been associated with positive reinforcement of memory during task learning [51].
The results allude to the notion that better prepared, higher-performing students required less assessment repetition to achieve desired grade outcomes, while some lesser-prepared students made fewer assessment attempts than might have been optimal for success. Some students who earned “F” grades in the course made no attempts at all during the final two weeks. It is possible that these opposing performance trends among stronger and weaker students had a negating effect on correlations between attempt numbers and grades. Weak correlation of unlimited attempts with assessment outcomes [52] and better assessment performance with fewer attempts [53] have also been reported by investigators. Poor performance has been observed despite use of unlimited-attempt learning and assessment strategies [54], including a lack of significant improvement in knowledge retention of pharmacy calculations with repeated testing despite student perceptions of successful retention [36].
5 Limitations
There were limitations to the study. A single cohort was exposed to self-pacing and repeat testing. Partitioning the class into two cohorts—a group encouraged to self-pace and a group receiving traditional assessment—would have been an intriguing design.
The modules consisted of questions randomly drawn from test banks. Some students expressed frustrations about encountering new questions with each module attempt. Use of curated, restricted lists of module questions could prove effective in encouraging module engagement in the future. In addition, module assessments were un-proctored. Although this decision was rooted in the belief that academic dishonesty would likely result in poor performance on the final exam, its weakness is acknowledged.
Procrastination was a deterrent to student success. Despite students’ warm reception of self-pacing, the desire for assessment submission deadlines warrants attention. Students were allowed access to the modules for a brief interval after the conclusion of the final exam. Although this access was intended to provide students with more time to obtain maximum module scores, the gesture was likely detrimental to the worst procrastinators. Imposition of access deadlines at intervals during the semester and/or prior to the final exam might have better prepared procrastinators for the final exam. Because the final exam accounted for a substantial percentage of students’ course grades, a higher-than-usual proportion of students failed the course. Better measures will be taken to limit procrastination in the future. Inhibition of procrastination might be integral to decreasing the failure rate in the course.
6 Conclusions
This investigation featured use of self-paced, multiple-attempt assessments to reduce student anxiety about pharmacy calculations. Use of multiple-attempt exams and modules contributed to statistically significant reduction in students’ perceived anxiety over the course of a semester. Although some association was observed among indices of assessment performance and course grades, procrastination in completing self-paced assessments had a negative impact on grade performance in weaker students. Refinement of assessment and grading measures is warranted to promote improvement not only in student comfort and reception, but in student performance in pharmacy calculations as well.
Data availability
The datasets generated during and/or analyzed during this study are not publicly available to protect the confidentiality of study participants but are available in de-identified form from the corresponding author on reasonable request.
Code availability
Not applicable.
References
Pascoe MC, Hetrick SE, Parker AG. The impact of stress on students in secondary school and higher education. Int J Adolesc Youth. 2020;25(1):104–12. https://doi.org/10.1080/02673843.2019.1596823.
Pena-López I. PISA 2015 results (volume I). Excellence Equity Education. 2016. https://doi.org/10.1787/9789264266490-en.
Shankar NL, Park CL. Effects of stress on students’ physical and mental health and academic success. Int J Sch Educ Psychol. 2016;4(1):5–9. https://doi.org/10.1080/21683603.2016.1130532.
Eisenberg D, Hunt J, Speer N, Zivin K. Mental health service utilization among college students in the United States. J Nerv Ment Dis. 2011;199(5):301–8. https://doi.org/10.1097/NMD.0b013e3182175123.
Mavilidi MF, Hoogerheide V, Paas F. A quick and easy strategy to reduce test anxiety and enhance test performance. Appl Cogn Psychol. 2014;28(5):720–6. https://doi.org/10.1002/acp.3058.
Hirsch JD, Nemlekar P, Phuong P, Hollenbach KA, Lee KC, Adler DS, Morello CM. Patterns of stress, coping and health-related quality of life in doctor of pharmacy students. Am J Pharma Educ. 2020. https://doi.org/10.5688/ajpe7547.
Votta RJ, Benau EM. Predictors of stress in doctor of pharmacy students: results from a nationwide survey. Curr Pharm Teach Learn. 2013;5(5):365–72. https://doi.org/10.1016/j.cptl.2013.06.014.
Choe KW, Jenifer JB, Rozek CS, Berman MG, Beilock SL. Calculated avoidance: math anxiety predicts math avoidance in effort-based decision-making. Sci Adv. 2019;5(11):1062. https://doi.org/10.1126/sciadv.aay1062.
Khasawneh E, Gosling C, Williams B. The correlation between mathematics anxiety, numerical ability and drug calculation ability of paramedic students: an explanatory mixed method study. Adv Med Educ Pract. 2020;11:869. https://doi.org/10.2147/AMEP.S258223.
Osahor K, Woodend K, Mackie J. The relationship between math personality, math anxiety, test preparation strategy and medication dose calculations in first year nursing students. J Nursing Educ Pract. 2019. https://doi.org/10.5430/jnep.v9n8p80.
Barroso C, Ganley CM, McGraw AL, Geer EA, Hart SA, Daucourt MC. A meta-analysis of the relation between math anxiety and math achievement. Psychol Bull. 2021;147(2):134. https://doi.org/10.1037/bul0000307.
Zhang J, Zhao N, Kong QP. The relationship between math anxiety and math performance: a meta-analytic investigation. Front Psychol. 2019;10:1613. https://doi.org/10.3389/fpsyg.2019.01613.
Mutlu Y. Math anxiety in students with and without math learning difficulties. Int Elec J Elem Educ. 2019;11(5):471–5. https://doi.org/10.2682/iejee.2019553343.
Schnee E. “In the real world no one drops their standards for you”: academic rigor in a college worker education program. Equity Excell Educ. 2008;41(1):62–80. https://doi.org/10.1080/10665680701764502.
Roediger HL III, Karpicke JD. Test-enhanced learning: taking memory tests improves long-term retention. Psychol Sci. 2006;17(3):249–55.
Agarwal PK, Finley JR, Rose NS, Roediger HL III. Benefits from retrieval practice are greater for students with lower working memory capacity. Memory. 2017;25(6):764–71. https://doi.org/10.1080/09658211.2016.1220579.
Brewer GA, Unsworth N. Individual differences in the effects of retrieval from long-term memory. J Mem Lang. 2012;66(3):407–15. https://doi.org/10.1016/j.jml.2011.12.009.
Smith AM, Floerke VA, Thomas AK. Retrieval practice protects memory against acute stress. Science. 2016;354(6315):1046–8. https://doi.org/10.1126/science.aah5067.
Barsham H. Can retrieval practice of the testing effect increase self-efficacy in tests and reduce test anxiety in 10-to 11-Year-Olds? Doctoral dissertation, University of Cambridge. 2021. https://doi.org/10.1786/CAM.77643
Pastötter B, Bäuml KH. Retrieval practice enhances new learning: the forward effect of testing. Front Psychol. 2014. https://doi.org/10.3389/fpsyg.2014.00286.
Agarwal PK, D’Antonio L, Roediger HL III, McDermott KB, McDaniel MA. Classroom-based programs of retrieval practice reduce middle school and high school students’ test anxiety. J Appl Res Mem Cogn. 2014;3(3):131–9. https://doi.org/10.1016/j.jarmac.2014.07.002.
Khanna MM. Ungraded pop quizzes: test-enhanced learning without all the anxiety. Teach Psychol. 2015;42(2):174–8. https://doi.org/10.1177/0098628315573144.
Baghdady M, Carnahan H, Lam EW, Woods NN. Test-enhanced learning and its effect on comprehension and diagnostic accuracy. Med Educ. 2014;48(2):181–8. https://doi.org/10.1111/medu.12302.
Larsen DP, Butler AC, Roediger HL III. Comparative effects of test-enhanced learning and self-explanation on long-term retention. Med Educ. 2013;47(7):674–82. https://doi.org/10.1111/medu.12141.
Dobson JL. Effect of uniform versus expanding retrieval practice on the recall of physiology information. Adv Physiol Educ. 2012;36(1):6–12. https://doi.org/10.1152/advan.00090.2011.
Hughes M, Salamonson Y, Metcalfe L. Student engagement using multiple-attempt ‘Weekly Participation Task’ quizzes with undergraduate nursing students. Nurse Educ Prac. 2020. https://doi.org/10.1016/j.nepr.2020.102803.
Messineo L, Gentile M, Allegra M. Test-enhanced learning: analysis of an experience with undergraduate nursing students. BMC Med Educ. 2015;15(1):1–7. https://doi.org/10.1186/s12909-015-0464.
Cunningham H, Roche J. Using Web CT to determine competency in medication dosage calculation for nursing students. Nurse Educ. 2001;26(4):164–6.
Terenyi J, Anksorus H, Persky AM. Impact of spacing of practice on learning brand name and generic drugs. Am J Pharm Educ. 2018. https://doi.org/10.5688/ajpe6179.
Bell EC. Individual vs. team based readiness assurance testing in pharmacy calculations. Int J Inst. 2022. https://doi.org/10.29333/iji.2022.15142a.
Bell EC, Fike DS, Liang D, Lockman PR, McCall KL. Assessment of computer-mediated module intervention in a pharmacy calculations course. Educ Inf Technol. 2017;22(5):2013–25. https://doi.org/10.1007/s10639-016-9531-8.
Lacroix M, McCall KL III, Fike DS. The Keller personalized system of instruction in a pharmacy calculations course: a randomized trial. Curr Pharm Teach Learn. 2014;6(3):348–52. https://doi.org/10.1016/j.cptl.2014.02.002.
Weng P. Developmental math, flipped and self-paced. Primus. 2015;25(9–10):768–81. https://doi.org/10.1080/10511970.2015.1031297.
Bourne DW, Davison AM. A self-paced course in pharmaceutical mathematics using web-based databases. Am J Pharm Educ. 2006. https://doi.org/10.5688/aj7005116.
Badyal DK, Bala S, Singh T, Gulrez G. Impact of immediate feedback on the learning of medical students in pharmacology. J Adv Med Educ Profess. 2019. https://doi.org/10.30476/JAMP.2019.41036.
Coker AO, Lusk KA, Maize DF, Ramsinghani S, Tabor RA, Yablonski EA, Zertuche A. The effect of repeated testing of pharmacy calculations and drug knowledge to improve knowledge retention in pharmacy students. Curr Pharm Teach Learn. 2018;10(12):1609–15. https://doi.org/10.1016/j.cptl.2018.08.019.
Driscoll R. Westside test anxiety scale validation. Online submission, Institute of Education Sciences. 2007. https://files.eric.ed.gov/fulltext/ED495968.pdf.
Nyroos M, Schéle I, Wiklund-Hörnqvist C. Implementing test enhanced learning: Swedish teacher students’ perception of quizzing. Int J Higher Educ. 2016;5(4):1–2. https://doi.org/10.5430/ijhe.v5n4p1.
Szpunar KK, Khan NY, Schacter DL. Interpolated memory tests reduce mind wandering and improve learning of online lectures. Proc Natl Acad Sci. 2013;110(16):6313–7. https://doi.org/10.1073/pnas.1221764110.
Mayer RE, Wells A, Parong J, Howarth JT. Learner control of the pacing of an online slideshow lesson: does segmenting help? Appl Cogn Psychol. 2019;33(5):930–5. https://doi.org/10.1002/acp.3560.
Fiel J, Lawless KA, Brown SW. Timing matters: approaches for measuring and visualizing behaviours of timing and spacing of work in self-paced online teacher professional development courses. J Learn Anal. 2018;5(1):25–40. https://doi.org/10.1860/jla.2018.51.3.
Lim JM. The relationship between successful completion and sequential movement in self-paced distance courses. Int Rev Res Open Dist Learn. 2016;17(1):159–79. https://doi.org/10.19173/irrodl.v17i1.2167.
Cho MH, Heron ML. Self-regulated learning: the role of motivation, emotion, and use of learning strategies in students’ learning experiences in a self-paced online mathematics course. Distance Educ. 2015;36(1):80–99. https://doi.org/10.1080/01587919.2015.1019963.
Davis MC, Duryee LA, Schilling AH, Loar EA, Hammond HG. Examining the impact of multiple practice quiz attempts on student exam performance. J Educ Online. 2020;17(2): n2.
Tchen P, Leung L, Simpson F, Kim-Sing A, Pearson ML. Bridging the gap: an evaluation of self-paced online transition modules for advanced pharmacy practice experience students. Curr Pharm Teach Learn. 2018;10(10):1375–83. https://doi.org/10.1016/j.cptl.2018.07.006.
Chiampas TD, Kassamali Z, Justo JA, Danziger LH. Opportunities and challenges in converting a pharmacy curriculum elective course from a live to an online teaching environment. Pharm Educ. 2018;18:50–3.
Zhu M. Enhancing MOOC learners’ skills for self-directed learning. Distance Educ. 2021;42(3):441–60. https://doi.org/10.1080/01587919.2021.1956302.
Purao S, Sein M, Nilsen H, Larsen EÅ. Setting the pace: experiments with Keller’s PSI. IEEE Trans Educ. 2016;60(2):97–104. https://doi.org/10.1109/TE.2016.2588460.
Verleger MA. Just five more minutes: the relationship between timed and untimed performance on an introductory programming exam. ASEE Ann Conf Exp. 2016. https://doi.org/10.18260/p.25510.
McDaniel MA, Wildman KM, Anderson JL. Using quizzes to enhance summative-assessment performance in a web-based class: an experimental study. J Appl Res Mem Cogn. 2012;1(1):18–26. https://doi.org/10.1016/j.jarmac.2011.10.001.
Attali Y. Effects of multiple-try feedback and question type during mathematics problem solving on performance in similar problems. Comput Educ. 2015;1(86):260–7. https://doi.org/10.1016/j.compedu.2015.08.011.
MacKenzie LM. Improving learning outcomes: unlimited vs limited attempts and time for supplemental interactive online learning activities. J Curr Teach. 2019;8(4):36–45. https://doi.org/10.5430/jct.v8n4p36.
Orchard RK. Multiple attempts for online assessments in an operations management course: an exploration. J Educ Bus. 2016;91(8):427–33. https://doi.org/10.1080/08832323.2016.1256262.
Eyre HL, Parks K, Crone-Todd DE. Student unit test persistence in a mastery-based general psychology course. Poster presented at the annual meeting of the Southeastern Association for Behavior Analysis, Greenville, SC. 2006.
Funding
Not applicable.
Author information
Contributions
Study conception, data collection, and data analysis were performed independently by the corresponding author. The author read and approved the final manuscript.
Ethics declarations
Ethics approval and consent to participate
This investigation’s protocol was exempted by the Texas Southern University Institutional Review Board (Federal Wide Assurance number: FWA00003570) in accordance with United States federal regulations 45 CFR 46.104(d)(1) and 45 CFR 46.104(d)(2):
- Research, conducted in established or commonly accepted educational settings, that specifically involves normal educational practices that are not likely to adversely impact students’ opportunity to learn required educational content or the assessment of educators who provide instruction.
- Research that only includes interactions involving educational tests (cognitive, diagnostic, aptitude, achievement), survey procedures, interview procedures, or observation of public behavior (including visual and auditory recording).
Competing interests
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Bell, E.C. Self-pacing and multiple-attempt assessment to address student anxiety in pharmacy calculations. Discov Educ 2, 9 (2023). https://doi.org/10.1007/s44217-023-00032-3