Journal of Formative Design in Learning

Volume 2, Issue 1, pp 20–35

Using Instructional Design to Support Community Engagement in Clinical and Translational Research: a Design and Development Case

  • Natercia Valle
  • Janet Brishke
  • Albert D. Ritzhaupt
  • Christy Evans
  • David R. Nelson
  • Elizabeth Shenkman


While community stakeholder engagement is becoming increasingly common in health care, operationalized training materials to support this learner population (community members) are scarce. Instructional design principles were used to create an Open Educational Resource (OER) to support the involvement of community stakeholders in health care research at a university health science center. Prior to the development of this project, a formal group, whose members named themselves Citizen Scientists (CSs), already existed to offer a lay perspective on clinical and translational research studies. These CSs are involved in a wide range of active committees within the university's college of medicine. The challenge of this program, however, is that the CSs require training to engage in these activities (e.g., reviewing research proposals). This design and development research case outlines the instructional design processes and the formative evaluation methods and results from the creation of the OER. While the description of the instructional design processes can be useful for similar project implementations, the information on the formative evaluation methods and results offers the following benefits: (a) helping community stakeholders analyze whether a project's goals have been met, (b) identifying project aspects that could be improved, and (c) supporting other communities by creating a model for project evaluation in similar contexts with similar project goals.


Keywords: Instructional design · Citizen Scientist · Clinical and translational research · Health care · Open educational resource · Formative evaluation

Citizen Scientist Program

Increasingly, stakeholder engagement from community members is recognized as essential for conducting more meaningful research that can ultimately lead to more rapid uptake and use of research findings in diverse health-related settings. The University of Florida (UF) Clinical and Translational Science Institute (CTSI) employs a team of Citizen Scientists (CSs) to offer a lay perspective on active and proposed clinical and translational research studies. Although CSs have had an important role in fields such as archeology, astronomy, and natural history (Silvertown 2009), such an approach is still relatively new in the domain of clinical research (Domecq et al. 2014). Therefore, instructional resources specifically designed for this learner population are scarce, making the formal training (Morrison et al. 2010) of new members a challenge for institutions seeking to establish CS programs at their sites.

The task of contributing to the generation of research ideas and actively participating in research projects may sound straightforward but, in reality, it requires special training and knowledge about the following areas: (a) the impact CSs can have as community members; (b) the precautions that research groups must take in relation to how they conduct the research and how they recruit, inform, and treat research participants ethically; (c) the rights of research participants; (d) how research studies are designed, proposed, and funded; (e) how research makes its way from the clinical setting to the community through translational research; (f) the value of multiple stakeholders in the research process; (g) how culture plays a role in health care; (h) how biomedical data can be used to improve health care outcomes; and (i) how to understand and speak the language of research.

Design and Development Research Approach

Given the many aforementioned topics that CSs have to learn for successful participation in clinical and translational research, the problem required a flexible solution that enabled the CSs to learn the materials at different paces and times. Further, members of the UF CS program can be added on an ongoing basis, which creates a logistical problem for training new members: integrating introductory-level content into existing CS activities is repetitive and time-consuming for veteran CSs, and can be overwhelming for new CSs. Therefore, a technology-enhanced, self-paced instructional system was a necessary solution to this problem. The solution also needed to be flexible enough to accommodate the curriculum's many use cases and to support face-to-face engagement among CSs and researchers. This article provides a design and development research case, defined as a study focused on a product's design, development, and formative evaluation in order to provide contextually rich lessons learned (Richey et al. 2004), and illustrates the process of creating technology-enhanced instructional materials to support the training of CSs. More specifically, this research case is representative of the Product and Tool Research category, which addresses each phase of the instructional design project as well as the rationale for its design and development, and the tools used (Richey and Klein 2007).

Research Questions and Purpose

The research questions that guided this study were based on content, learning, and curriculum validity (Reeves 2011): (a) Do the learning materials and pedagogy implemented in the CS curriculum create the depth and breadth necessary to prepare CSs to collaborate in clinical research?, and (b) Does the format of the learning materials used in the CS curriculum deliver the content effectively?

Based on a mixed method research approach (Johnson and Onwuegbuzie 2004) that used quantitative and qualitative data to analyze the context, learners, and quality of the learning materials developed, the purpose of this research was to explore how the practical approaches described for this context are effective in addressing the instructional problems identified, thereby contributing to the body of knowledge related to design and development research (Richey and Klein 2007). The main advantage of this design and development-mixed method research case is the complementary types of data offered to account for the instructional problem, project goals, and framework used for the design and development of the technology-enhanced, instructional materials.


The project subscribed to the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) instructional design model (Morrison et al. 2010) for the most part; however, formative evaluation was performed prior to final implementation; that is, the development and formative evaluation of each module proceeded in a cyclical scheme until a satisfactory result was achieved. This Analysis, Design, Development, Evaluation, and Implementation (ADDEI) approach was adopted because of time and budget constraints: had implementation been conducted first, more time and money would have been necessary to revise the video tutorials, assessments, and website, as well as the other platforms used to host the learning materials. Implementation was therefore conducted only in the final stage of the project, when all the learning materials had been designed, validated, and formatively evaluated.

The graphic in Fig. 1 outlines the three main phases of the project. Phase 1 was used to initiate (define the project goal and identify key stakeholders), analyze (narrow the context, define the learners and tasks, and identify opportunities and constraints), and design (select approaches to format, style, and execution) the project. Phase 2 represents the most active stage of the project, in which learning materials were developed, validated, and formatively evaluated in a cyclical manner, with multiple modules being developed in parallel. Finally, phase 3 was used to implement the learning materials in their final format and close the design project. Although the modules are organized in a sequential order in the final product, they were created according to the availability of content presenters, video editing staff, and the development of assessment items. This flexibility was crucial for the completion of the project in a timely and efficient manner.
Fig. 1

ADDIE-to-ADDEI Approach


Analysis

The analysis stage of the project was implemented through meetings with the program facilitator, Subject Matter Experts (SMEs), CSs, and observations of CSs’ regular meetings. The onsite observations and individual interviews with CSs were particularly useful to guide decisions on which media to use based on their affordances and the nature of the content to be covered.

Three CSs were interviewed using the interview protocol shown in Appendix 1. Two of the interviewees were female and one was male. The CSs included one full-time college student, and two senior citizens. The CSs’ tenure in the program ranged from 8 months to a little over a year. Motivations for joining the program included previous experiences with the health care system, an affinity to the institution, and the desire to be able to assist with research in general. While the CSs interviewed ranged in experience, the interviewees provided a wealth of knowledge pertaining to the challenges of learning about their roles in the CS program. These challenges included the complex web of health care funding sources, types and classifications of research, complex terminology and acronyms used in the health care system, and most importantly, how CSs fit in the process of clinical and translational research.

As for their learning preferences, all interviewees preferred to receive instructional information through video; however, one of them emphasized that there should be a balance between video tutorials and written materials because, in her words, "Video at times is really good, but solely video is like watching Netflix or TV vs. trying to learn." Based on our learner and contextual analysis (Morrison et al. 2010), a multimedia learning environment seemed to be the optimal choice to meet the project goals and learners' general preferences. This decision is supported by the Cognitive Theory of Multimedia Learning (CTML), which states that the cognitive processes involved in learning can be enhanced when appropriate pictorial and verbal information are integrated with each other and with prior knowledge (Mayer 2005).

An important feature of the context pertaining to this project is the multilevel engagement model followed by this CS program. Figure 2 shows the three possible levels of engagement for CSs at UF as well as possible tasks that they can be involved with as CSs. This model of multilevel engagement is supported by the structure of the curriculum, which can be implemented in either a sequential or non-sequential fashion to meet specific CS instructional needs. The curriculum’s self-paced format was designed to support various levels of involvement, as well as ongoing training needs by other institutions.
Fig. 2

CS Program. Levels of engagement (source: UF CTSI)

The project team was composed of CSs, an educational technology researcher/practitioner, an educational technology graduate student, a public health professional, and health care researchers who aided the identification of SMEs for the design and development of the instructional materials. The SMEs who collaborated on the project were nurses, physicians, a research navigator, a grant specialist, research coordinators, a social worker, and faculty members in the areas of ethics, research, biomedical informatics, and translational science.


Design

The design process started with the definition of learning objectives, which were based on the knowledge that lay people need in order to become well-informed, engaged CSs in clinical and translational research. Learning objectives for each lesson were crafted by project staff and were based on the takeaways necessary to gain a basic understanding of each topic. These learning objectives are designed to prepare the CSs to collaborate in meaningful ways with researchers, and most importantly, empower them to offer critical, insightful reviews on a variety of topics.

The curriculum was developed to support the engagement of community stakeholders in clinical and translational research; its development also adhered to the notion of involving multiple stakeholders from the start of the project through its completion. This early stakeholder engagement approach (Schwalbe 2015) was crucial to the instructional design and quality of the final product, as CSs provided feedback on, validation of, and incremental approval of the instructional materials.

The design of the curriculum and supporting instructional materials was based on previous instructional practices used by the CTSI staff and veteran CSs, and on the levels of engagement new CSs may develop to become involved in research activities. The existing instructional materials used for in-class training as well as the insights provided by the face-to-face instructors of the program were helpful to the instructional design team to define the scope of the project (Schwalbe 2015), learning objectives, instructional materials, and evaluation instruments (Morrison et al. 2010).

The CS curriculum and supporting instructional materials represent an interface between teaching- and learner-centered approaches (Brown 2003). For example, the web-based instructional materials and activities are based on what new CSs need to learn as they become involved in clinical and translational research, with no built-in individualized assistance; however, the curriculum can be easily modified to integrate the web-based materials into classroom activities, which allows the instructor to use the materials according to individual needs and learning profiles (Brown 2003).

Due to the nature of the topics to be presented and the amount of content, it became necessary to divide the curriculum into several modules based on thematic topics CSs are likely to encounter in research projects (Sweller 1994):
  • Welcome and Orientation: provides a general introduction to the CS curriculum and the role of CSs in research, as well as the importance of maintaining confidentiality

  • Research Ethics: presents information on elements necessary for ethical research including the Institutional Review Board, historical breaches in research ethics, and the informed consent process

  • Sponsored Research: familiarizes new CSs with the concept of clinical research, including how it is designed, funded, and presented in scientific publications

  • Clinical and Translational Science: offers information on types of research studies and the roles CSs can have in clinical and translational research

  • Stakeholder Engagement: describes examples of and reasons for stakeholder engagement, and illustrates how research can benefit from multiple perspectives (e.g., researchers, vendors, patients, and caregivers)

  • Cultural Diversity in Research: addresses cultural competence and how CSs can ensure multiple cultures are acknowledged and respected in research projects

  • Biomedical Informatics: features information on the role of data in transforming health care, including how big data can be used to improve health outcomes

Each of these seven modules is broken down further into four or five lessons, or sub-modules, half of which are didactic in nature. Figure 3 provides a screenshot of a didactic lesson from module two. Every didactic lesson includes "thought questions" for learners to consider as they begin to move through the instructional materials. These thought questions serve as a pre-instructional strategy (Gall 1970) to assist the learners with the introduction of the new content (Hartley and Davies 1976). Additionally, each didactic lesson includes clearly written learning objectives (Gronlund 2004) with observable action verbs and appropriate content for the lesson. The didactic lessons also include a short (less than 10 min) video-based lecture covering the learning objectives, and practice assessments with auto-grading and elaboration feedback messages for both correct and incorrect responses.
Fig. 3

Screen shot—CS Curriculum Website. This page is an example of a didactic lesson


Development

The curriculum was designed and implemented in 6 months and required the involvement of external contractors for video production and editing. The project team created an online, self-paced instructional system to operationalize the preparation of new CSs. In addition to the local impact, the web-based platform allows easy customization of the instructional materials to meet the needs of CS groups at other institutions. The instructional materials were developed using a variety of information and communication technologies, including WordPress (HTML, JavaScript, CSS, MySQL, and PHP), Adobe Captivate, YouTube, and Adobe Premiere. The full curriculum is housed on a WordPress instance hosted at UF. YouTube was used to deliver the video tutorial content on a public channel, with the videos embedded on the WordPress site. Adobe Captivate was used to develop interactive quizzes with elaboration feedback for both correct and incorrect responses, and Adobe Premiere was used for video editing.

To ensure that the content in the CS curriculum was compatible with the project's goal and learners' needs, a comprehensive process of creation, revision, validation, and formative evaluation was conducted. SMEs were approached by project staff and offered a description of the project, draft learning objectives for their topic, and parameters such as format, duration, and project timeline requirements. After agreeing to participate, SMEs proceeded in several ways, sometimes utilizing all three approaches: (1) scheduling a meeting with the project manager to further discuss expectations and needs, (2) arranging to present draft content in person to the CS group to ensure the content was in lay terms and "digestible" in the allotted time, and (3) setting a date to record the video presentation.

Once the SME was comfortable with the assignment, a draft PowerPoint slide set and a written transcript of the presentation were created and sent to the project manager. Revisions were made to the presentation materials to ensure the visuals were consistent across presenters and the content was written for a lay audience. This meant minimizing the use of jargon and abbreviations, explaining key terminology prior to its first use, and careful review by the project manager to ensure compliance and consistency. Revising content prior to recording was necessary, as any reshooting would need to be scheduled among the video studio, SME, and project manager. Given the timeline for this project, reshoots were not encouraged; in fact, only one SME reshot their video lesson after formative evaluation by CSs.

The videos were recorded in a professional recording studio with a "green screen," guided by best practices for creating instructional videos and multimedia learning materials (Clark and Mayer 2016; Swarts 2012). The final videos superimposed the presentation slides in the background, to the left of the SME (see Fig. 3), ensuring a uniform look and feel across topics and presenters. The SMEs presented their video content by reading the transcript from a teleprompter. SMEs were allowed as many takes as necessary to record the video tutorial, and editing of major verbal stumbles or reading miscues occurred in post-production. Edited videos were reviewed by project staff to ensure quality prior to being shown to the CSs for formative evaluation.

The assessment items and associated feedback messages were created by project staff based on the final SME video presentations, PowerPoint slides, and teleprompter scripts. Revisions to the assessments focused on ensuring items were in lay terms and aligned with the lesson's learning objectives. Ongoing input from the CS program manager was crucial for this component, as experience training this specific learner population offered vital insight into assessment content. Once assessment items were created and revised, they were sent to SMEs for further revision and validation; all assessment items were validated by SMEs prior to the formative evaluation. There was only one instance in which minor changes suggested by the SME could not be implemented before the formative evaluation, as they were sent only a few hours prior to the session; examination by project staff for accuracy was still conducted. After the formative evaluation, additional revisions to the video tutorials and assessment items were implemented as needed, and in the case of video tutorials, only if absolutely necessary. In their final format, assessment items incorporate a color-coded interactive feedback functionality created in Adobe Captivate to aid CSs in regulating their own learning (Butler and Winne 1995). The use of feedback in computer-based learning environments has been shown to be an effective learning aid when the feedback provides elaboration (Van der Kleij et al. 2015).
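The grading-plus-elaboration-feedback pattern can be sketched as follows. This is an illustrative Python sketch only; the production quizzes were built in Adobe Captivate, and while the item is the sample recognition question from the curriculum, the feedback wording here is a hypothetical example:

```python
# Sketch of color-coded elaboration feedback for a practice assessment item.
# The item stem and options come from the curriculum's sample recognition
# question; the feedback messages are hypothetical, not the project's text.

ITEM = {
    "stem": "Which of the following would not be classified as "
            "intellectual property?",
    "options": ["A researcher's hypothesis", "A researcher's dataset",
                "A researcher's idea", "A researcher's last name"],
    "answer": 3,  # index of the correct option (d)
    "correct_msg": "Correct. A last name is not a product of research, "
                   "so it is not intellectual property.",
    "incorrect_msg": "Not quite. Hypotheses, datasets, and ideas are all "
                     "intellectual property generated by researchers.",
}

def grade(item, choice):
    """Return (is_correct, color, elaboration_message) for a learner's choice."""
    correct = choice == item["answer"]
    color = "green" if correct else "red"  # color-coded feedback cue
    message = item["correct_msg"] if correct else item["incorrect_msg"]
    return correct, color, message
```

The key design point mirrored here is that both branches return an elaboration message, not just a right/wrong flag, which is the feedback condition the Van der Kleij et al. (2015) meta-analysis found effective.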


Formative Evaluation

Following the completion of each video tutorial and assessment, the materials were presented to the CS group during an in-person formative evaluation session. Not all CSs were present at every session; attendance ranged from five to 11 CSs at any given time. As with the creation of the learning materials, the materials were not presented in any particular order and were offered as they were completed to accommodate the project timeline. The formative evaluation process was divided into six sessions, mirrored the research questions, and gauged CS performance on and perceptions of the assessments that accompany each video (Appendix 2), as well as perceptions of the quality of the videos (Appendix 3). Table 1 shows which lessons were included in each formative evaluation session.
Table 1

Formative evaluation sessions and respective sub-modules




Session 1: 1.4, 2.4a
Session 2: 1.2, 2.1
Session 3: 2.3, 2.4b, and 6.1
Session 4: 2.2, 3.3, and 4.1
Session 5: 3.2, 4.2, and 5.1
Session 6: 3.1, 7.1, and 7.2
One sub-module, 2.4, received substantial revisions after its first formative evaluation (2.4a) and was granted another formative evaluation (2.4b), which included revisions to question stems and distractors and a new video tutorial by the same presenter. This was the only video tutorial that was re-filmed. The project team relied on descriptive statistics from Likert-scale questions, together with open-ended questions, to evaluate the assessment items and the quality of the video tutorials. The data analysis and graphics were produced in R, a free software environment for statistical computing and graphics.


Implementation

Once the assessment items and videos achieved the desired quality, they were included with supplemental learning materials (e.g., journal articles, tip sheets, and links to additional resources) on the CS program website. Within the seven modules are 30 lessons, half of which are didactic. As noted, the didactic lessons each include thought questions, learning objectives, a video tutorial, a practice assessment with elaboration feedback, and in some cases, supplemental materials and video clips with tips shared by veteran CSs. The remaining lessons contain additional training information including case study videos, animated videos, and video interviews offering insights from both CSs and researchers who have engaged CSs.

All raw instructional materials created throughout the project are available as an Open Educational Resource (OER), meaning they are available to anyone upon request (Johnstone 2005), which can facilitate customization by other institutions implementing a CS program. Furthermore, the availability of the instructional materials as an OER reflects the institutional goals of supporting socially responsible practices in clinical and translational research (Reeves 2000, 2011), including the involvement of CSs and potential engagement with other institutions and stakeholders with an interest in the CS program and curriculum. This research is based on practices that connect the learning, community stakeholder engagement, and clinical research domains, with the added benefit of outreach impact supported by its OER nature.


Results

Internal consistency reliability was α = .885 for the assessment score and α = .863 for the video score. These high reliability coefficients support the use of composite scores in evaluating the instructional materials.
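For readers wishing to reproduce this reliability check, Cronbach's alpha can be computed directly from item-level scores. The following is a minimal Python sketch with made-up data (not project data); the study's own analysis was conducted in R:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a k-item instrument.

    `scores` is a list of rows, one per respondent, each holding that
    respondent's k item scores.
    """
    k = len(scores[0])

    def variance(xs):
        # Population variance.
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    item_variances = [variance([row[i] for row in scores]) for i in range(k)]
    total_variance = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

# Made-up scores for five respondents on four items (illustrative only):
sample = [[4, 5, 4, 5], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 2], [4, 4, 5, 4]]
alpha = cronbach_alpha(sample)
```

Alpha rises as items covary: perfectly correlated items yield 1.0, and values near .86–.89, as reported above, indicate the items behave consistently enough to justify composite scores.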

Research question 1: Do the learning materials and pedagogy implemented in the CS curriculum create the depth and breadth necessary to prepare CSs to collaborate in clinical research?

The assessment items comprised two types of multiple-choice questions: recognition and knowledge-transfer items. This combination encourages CSs to keep the information longer in short-term memory, which can contribute to its transfer to long-term memory (Sweller et al. 1998), facilitating learning. Table 2 provides a sample recognition question and a sample knowledge-transfer question from the "intellectual property and confidentiality" didactic lesson to illustrate the nature of the assessment items.
Table 2

Sample recognition and knowledge-transfer questions

Recognition question

 Which of the following would not be classified as intellectual property?

  a. A researcher’s hypothesis

  b. A researcher’s dataset

  c. A researcher’s idea

  d. A researcher’s last name

Knowledge-transfer question

 Please read the following scenarios and choose the most appropriate response for each question based on what you have learned about intellectual property and confidentiality.

 Dr. Samuel Parker has completed a research study in which he and his team have tested a surgical process that they hypothesized would provide about 75% relief to most people suffering from a certain kind of chronic pain. As a Citizen Scientist, you worked on the grant proposal with Dr. Parker and members of his research team. Because of this, you are included in an e-mail message which states that the preliminary results of the study indicate that this technique does work.

 You are very excited about the results of the study and you forward the e-mail to your sister, who suffers from the same form of chronic pain so that she can get the surgery as quickly as possible.

 Is it acceptable for you to share this information with your sister?

  a. As a Citizen Scientist and a member of the research team, you have the right to share this information.

  b. It is acceptable to share this information with your sister because she is a relative.

  c. All preliminary results are considered intellectual property and should not be disclosed.

  d. Patients in urgent need of this procedure should have knowledge of and access to this information as soon as possible, so you should share it.

Assessment items were revised when any given question had a correct response rate of less than 60% in the formative evaluation data. Figure 4 shows a summary of CS performance on assessments from all six formative evaluation sessions and the mean percentage of correct responses per didactic lesson or sub-module.
Fig. 4

Mean percentage of correct responses per sub-module

Even though the mean percentage of correct responses was less than 60% for only one lesson, sub-module 2.4, several questions in different lessons (Appendix 4) were also below this threshold, making additional revisions necessary for most lessons. These individual items were checked for clarity of language, appropriate stems, believable distractors, and clear instructions.
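The 60% revision rule amounts to a simple filter over per-item correct-response rates. The Python sketch below uses hypothetical item identifiers and rates purely for illustration; the project's actual analysis was done in R:

```python
def items_needing_revision(correct_rates, threshold=0.60):
    """Return item ids whose correct-response rate falls below the threshold.

    `correct_rates` maps an item id to the fraction of CSs who answered
    correctly. The ids and rates used below are hypothetical examples.
    """
    return sorted(item for item, rate in correct_rates.items() if rate < threshold)

# Hypothetical per-item rates, keyed as "sub-module-question":
rates = {"2.4-Q1": 0.45, "2.4-Q2": 0.55, "1.2-Q1": 0.82, "6.1-Q3": 0.58}
flagged = items_needing_revision(rates)
# Flagged items would then be checked for clarity of language, appropriate
# stems, believable distractors, and clear instructions.
```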

The quality of the assessment items was also evaluated through a survey consisting of five statements with which CSs were asked to agree or disagree. The options ranged from "strongly disagree" to "strongly agree" and were coded on a Likert scale from 1 to 5 during the data analysis, so a higher mean represents a more positive outcome. Figure 5 shows that the CSs were generally satisfied with the assessment items used to measure their learning from the video tutorials.
Fig. 5

Quality of assessment items. Bars represent average score per sub-module and error bars represent one standard deviation from the mean
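The coding and per-sub-module summary behind these bar charts can be sketched as follows. This is a Python illustration only: the actual analysis was carried out in R, the intermediate scale labels are assumed (the survey is described only by its endpoints), and the sample responses are invented:

```python
# Coding of survey responses. The two endpoints come from the instrument;
# the three intermediate labels are assumed for illustration.
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def summarize(responses):
    """Mean and (population) standard deviation of coded responses for one
    sub-module, i.e., the bar height and error-bar length in the figures."""
    coded = [LIKERT[r] for r in responses]
    mean = sum(coded) / len(coded)
    sd = (sum((x - mean) ** 2 for x in coded) / len(coded)) ** 0.5
    return mean, sd

# Hypothetical responses from one formative evaluation session:
mean, sd = summarize(["agree", "strongly agree", "agree", "neutral"])
```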

Research question 2: Does the format of the learning materials used in the CS curriculum deliver the content effectively?

In addition to the evaluation of the assessment items, the quality of the video tutorials was evaluated through a similar survey presented to CSs after they had viewed each video and completed the corresponding assessment. The results related to the quality of the video tutorials are presented in Fig. 6. Overall, the videos were well received and earned favorable ratings from Citizen Scientists. It is worth noting that the videos receiving lower ratings were those where the group felt the presenter spoke too quickly. The project team believes this issue was remedied by the delivery format of the final product: individual viewings of each video with access to the pause and rewind features in YouTube. Those learner control features were not available to CSs during the group formative evaluation sessions, which may account for the lower scores on some videos.
Fig. 6

Quality of video tutorials. Bars represent average score per sub-module and error bars represent one standard deviation from the mean


Discussion

The design approach used in the curriculum development allows CSs to learn via thought questions, learning objectives, video tutorials, learner-controlled navigation, and practice assessments with elaborative feedback; and enables them to use this experience as a “hook” to engage in face-to-face interactions with instructors, other CSs, and researchers. The opposite is also possible: face-to-face meetings and classroom activities can enhance the interaction CSs have with the web-based curriculum content. Table 3 outlines the research questions and corresponding answers based on the findings of this research.
Table 3

Research questions and answers

a) Do the learning materials and pedagogy implemented in the CS curriculum create the depth and breadth necessary to prepare CSs to collaborate in clinical research?

The elaboration of video tutorials and the validation of assessment items by SMEs provided evidence that the curriculum would serve CSs well in terms of the information they need to become sufficiently knowledgeable in a given area to participate productively as stakeholders in clinical research. The cyclical revision of assessment items against set performance goals was crucial to ensure that CSs would achieve at least a baseline understanding of the content presented, including the ability to transfer factual information to the more ill-defined situations introduced through the scenarios used in the assessment items. Some of the content overlaps by design to ensure that CSs are met where they are. This approach to facilitating the learning objectives offers a more customized learning experience to current and future CSs.

b) Does the format of the learning materials used in the CS curriculum deliver the content effectively?

The different learning materials (e.g., video tutorials, assessment items, and face-to-face support by the instructor) were all carefully crafted to represent the types of knowledge and skills that CSs need to engage in activities related to clinical and translational research. The materials were viewed favorably overall when tested in the formative evaluation sessions.

Lessons Learned

The instructional design processes described in this article were successful progressions toward the final technology-enhanced learning solution: a self-paced, online OER that uses video- and assessment-based instructional packages across seven distinct topics. Many challenges were encountered in designing and developing the instructional materials. First and foremost, the modified ADDIE approach (ADDEI) employed during this project proved effective in addressing the constraints imposed by the allocated budget, tight deadlines, and the busy schedules of all involved; it should therefore be considered a viable option for other projects facing similar constraints.

Additionally, it is imperative to develop clear guidelines and procedures for the development of the video tutorials. The SMEs, who also served as video tutorial presenters, needed clear instructions to focus their content, guidance on writing their teleprompter scripts, direction on designing their slides, guidelines for recording best practices, and often multiple takes to finalize their recordings. It is equally important to have strong SME buy-in and commitment (Van Rooij 2010) to both the process and the product to ensure the highest possible quality. For instance, at first, many of the SMEs did not want to write a script for their video tutorials. However, once the logistics of the video recording experience were described and project staff offered to assist in writing the content, SMEs were often happy to edit a draft script rather than write one in its entirety. This process required clear and concise communication, patience, and staff dedication to coordinate multiple video tutorial slides and scripts concurrently.

The assessment items for each didactic lesson should be unmistakably aligned with the learning objectives, presentation slides, and teleprompter script (Martone and Sireci 2009). Using consistent language from the video tutorials in the assessment items was critically important to avoid confusion and unnecessary extraneous cognitive load among the learners during the practice assessments (Paas et al. 2003). In several cases, after the formative evaluation sessions with the CSs, the instructional design team had to “wordsmith” the assessment stems and distractors to ensure consistency and clarity. Another challenge was creating consistent, effective slides with appropriate pictures, text, and animation effects that facilitate learning. SMEs without instructional design training tend to use unnecessary animations, redundant text, and decorative pictures, which can hinder learning outcomes (Morrison et al. 2010).

A final consideration concerns the formative evaluation process, its instruments, and the instructions given to the target learners. Initially, the formative evaluation sessions were open for CS comments about the video tutorials and assessment items. While some of this open conversation was helpful, the project team quickly learned that the open discussion format often drifted into irrelevant topics and unhelpful points. The team later decided to have the CSs provide all of their feedback on the videos and assessments in paper form only, a change that resulted in more focused formative evaluation data. While the lessons discussed here are not exhaustive, they provide contextually rich guidance for others implementing similar projects with similar processes. Because this project was completed within a specific context, these lessons may be difficult to generalize.

Current Status and Next Steps

The final step for the CS Instructional Design Project was the creation of a detailed instructor guide to support the adoption of the curriculum by other CS groups. This guide features additional resources, discussion prompts that can be used in online or face-to-face instruction, and suggestions on how to integrate this content into existing institutional training resources. The instructor guide will be used to implement the curriculum with a small cohort of untrained CSs at UF, where the curriculum content has been ported into a Learning Management System (LMS) that will allow for a customized experience with clear performance metrics. Organizing the curriculum in an LMS was important to ensure the quality of the training, give instructors the flexibility to incorporate additional materials and assess learning performance, provide a platform that learners can revisit as needed, and give learners the opportunity to regulate their own learning (Vovides et al. 2007). Following implementation of the curriculum with this cohort, a summative evaluation is planned. It will include individual interviews with members of the cohort, evaluation of their performance on the assessment items, and their overall impressions of the quality of the video tutorials and learning materials. The similarity of these instruments to those used during the formative evaluation will allow the quality and suitability of the curriculum for its target audience to be assessed.

Closing Remarks

The involvement of lay people in clinical and translational research has proven to be an essential (Domecq et al. 2014) and cost-effective (Bonney et al. 2009) measure, not only because researchers benefit from the unique perspectives that CSs bring to the research process, but especially because of the body of knowledge generated through these unprecedented levels of contribution. Instructional materials developed specifically for new CSs can contribute to the wider adoption of translational science (Zerhouni 2005), bringing it closer to community stakeholders and expanding the scope of scientific knowledge in meaningful ways. Although the design and development research case presented here is, to some degree, context-bound because it applies to a particular project (Richey and Klein 2007), it provides valuable insights into how instructional design research and practice can support the development of robust, technology-enhanced curriculum materials that facilitate learning.



Research reported in this publication was supported in part by the OneFlorida Clinical Data Network, funded by the Patient-Centered Outcomes Research Institute #CDRN-1501-26692, in part by the OneFlorida Cancer Control Alliance, funded by the Florida Department of Health’s James and Esther King Biomedical Research Program #4KB16, and in part by the University of Florida Clinical and Translational Science Institute, which is supported in part by the NIH National Center for Advancing Translational Sciences under award number UL1TR001427. The authors would like to acknowledge the effort and assistance provided by the University of Florida Citizen Scientist Program members: Anastasia Anderson, Ravi Bhosale, Shirley Bloodworth, Quintina Crawford, Christy Evans, Myrtle Graham, Claudia Harris, Nathan Hilton, Janelle Johnson, Bill Larsen, Carlos Maeztu, and Nadine Zemon.

Compliance with Ethical Standards


The content is solely the responsibility of the authors and does not necessarily represent the official views of the Patient-Centered Outcomes Research Institute (PCORI), its Board of Governors or Methodology, the OneFlorida Clinical Research Consortium, the University of Florida’s Clinical and Translational Science Institute, the Florida Department of Health, or the National Institutes of Health.


  1. Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., & Shirk, J. (2009). Citizen science: a developing tool for expanding science knowledge and scientific literacy. Bioscience, 59(11), 977–984.
  2. Brown, K. L. (2003). From teacher-centered to learner-centered curriculum: improving learning in diverse classrooms. Education, 124(1), 49.
  3. Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: a theoretical synthesis. Review of Educational Research, 65(3), 245–281.
  4. Clark, R. C., & Mayer, R. E. (2016). E-learning and the science of instruction: proven guidelines for consumers and designers of multimedia learning. Hoboken: John Wiley & Sons.
  5. Domecq, J. P., Prutsky, G., Elraiyah, T., Wang, Z., Nabhan, M., Shippee, N., Pablo Brito, J., Boehmer, K., Hasan, R., Firwana, B., Erwin, P., Eton, D., Sloan, J., Montori, V., Asi, N., Abu Dabrh, A. M., & Murad, M. H. (2014). Patient engagement in research: a systematic review. BMC Health Services Research, 14, 89.
  6. Gall, M. D. (1970). The use of questions in teaching. Review of Educational Research, 40(5), 707–721.
  7. Gronlund, N. E. (2004). Writing instructional objectives for teaching and assessment. Upper Saddle River: Pearson/Merrill/Prentice Hall.
  8. Hartley, J., & Davies, I. K. (1976). Preinstructional strategies: the role of pretests, behavioral objectives, overviews and advance organizers. Review of Educational Research, 46(2), 239–265.
  9. Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: a research paradigm whose time has come. Educational Researcher, 33(7), 14–26.
  10. Johnstone, S. M. (2005). Open educational resources serve the world. Educause Quarterly, 28(3), 15.
  11. Martone, A., & Sireci, S. G. (2009). Evaluating alignment between curriculum, assessment, and instruction. Review of Educational Research, 79(4), 1332–1361.
  12. Mayer, R. E. (Ed.). (2005). The Cambridge handbook of multimedia learning. Cambridge: Cambridge University Press.
  13. Morrison, G. R., Ross, S. M., Kemp, J. E., & Kalman, H. (2010). Designing effective instruction. Hoboken: John Wiley & Sons.
  14. Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: recent developments. Educational Psychologist, 38(1), 1–4.
  15. Reeves, T. C. (2000). Socially responsible educational technology research. Educational Technology, 40(6), 19–28.
  16. Reeves, T. C. (2011). Can educational research be both rigorous and relevant? Educational Designer, 1(4), 1–24.
  17. Richey, R. C., & Klein, J. D. (2007). Design and development research: methods, strategies, and issues. Mahwah: Lawrence Erlbaum Associates.
  18. Richey, R. C., Klein, J. D., & Nelson, W. A. (2004). Developmental research: studies of instructional design and development. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology (pp. 1099–1130). Mahwah: Lawrence Erlbaum Associates.
  19. Schwalbe, K. (2015). Information technology project management. Cengage Learning.
  20. Silvertown, J. (2009). A new dawn for citizen science. Trends in Ecology & Evolution, 24(9), 467–471.
  21. Swarts, J. (2012). New modes of help: best practices for instructional video. Technical Communication, 59(3), 195–206.
  22. Sweller, J. (1994). Cognitive load theory, learning difficulty, and instructional design. Learning and Instruction, 4(4), 295–312.
  23. Sweller, J., Van Merrienboer, J. J., & Paas, F. G. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296.
  24. Van der Kleij, F. M., Feskens, R. C., & Eggen, T. J. (2015). Effects of feedback in a computer-based learning environment on students’ learning outcomes: a meta-analysis. Review of Educational Research, 85(4), 475–511.
  25. Van Rooij, S. W. (2010). Project management in instructional design: ADDIE is not enough. British Journal of Educational Technology, 41(5), 852–864.
  26. Vovides, Y., Sanchez-Alonso, S., Mitropoulou, V., & Nickmans, G. (2007). The use of e-learning course management systems to support learning strategies and to improve self-regulated learning. Educational Research Review, 2(1), 64–74.
  27. Zerhouni, E. A. (2005). Translational and clinical science—time for a new vision. New England Journal of Medicine, 353(15), 1511–1513.

Copyright information

© Association for Educational Communications & Technology 2018

Authors and Affiliations

  • Natercia Valle (1)
  • Janet Brishke (2)
  • Albert D. Ritzhaupt (1)
  • Christy Evans (2)
  • David R. Nelson (2)
  • Elizabeth Shenkman (2)
  1. School of Teaching and Learning, College of Education, University of Florida, Gainesville, USA
  2. University of Florida College of Medicine, Gainesville, USA
