
Using Technology to Structure and Scaffold Real World Experiential Learning in Distance Education


Team projects allow students to apply their technical skills to a real-world context and build twenty-first century competencies, including teamwork, project management and communication skills. However, the complex challenges that such experiential learning projects present for students and faculty can limit the scale of implementation. This article argues that Virtual Business Projects (VBP), a model of team-based experiential learning where teams of students complete a virtual business project for an industry sponsor, can mitigate these problems by leveraging instructional technology and learning analytics. The VBP model is deployed in multiple universities, which have provided more than 2500 Virtual Business Projects since 2015. We will discuss how innovative technology, embedded in thoughtful learning design, supports experiential learning by taking advantage of features such as customizable scaffolding, automated reflection and feedback loops, and learning analytics.


The 4th industrial revolution, an era of human and machine augmentation (Bonciu 2017), presents complex challenges for educational institutions charged with preparing students for success. Recent studies suggest that the core competencies people need to be successful in this era are different from those needed for success in the 3rd industrial revolution, and that students are not adequately equipped with these skills before entering the workforce (Djankov and Saliola 2019; Wolff and Booth 2016). Employers are calling for a shift in what is taught and how it is taught, suggesting a transition towards pedagogies that immerse students in active and effortful work (Andrade 2016).

In 2015, the World Economic Forum (WEF) developed a twenty-first Century Skills framework that identified foundational literacies, competencies, and character qualities required for a successful career as we transition into the 4th industrial revolution (World Economic Forum 2015). In parallel, literature has emerged analyzing how different pedagogical practices like service-learning, work-integrated learning, and project-based learning support the development of these WEF twenty-first Century Skills (Ahuna et al. 2014; Morgan 2016; Snape 2017; Tasso et al. 2017). Team consulting projects are one such pedagogical practice employed in capstone courses in higher education (Case and Hoot 2019; Mattarocci and Ball 2018; Reidenberg and Long 2017). These capstones are referred to as capstone projects, capstone consulting teams, and various other names in the literature, but their nature is the same. Students work in teams to deliver a project for an industry client. For simplicity, we refer to them as Team Projects, and we use the term Virtual Business Project to refer to a technology-enabled Team Project.

These Team Projects are immersive, providing real-world contexts in which students have the opportunity to learn. In essence, they are authentic learning experiences; that is, learning experiences that are genuine and where learning is situated in real-life contexts (Karakaş-Özür and Duman 2019). The five standards of authentic learning introduced by Newmann and Wehlage (1993) and widely used in relevant research and literature (e.g., Karakaş-Özür and Duman 2019; Lock and Duggleby 2017; Luo et al. 2017) are higher order thinking, depth of knowledge, connectedness to the world, substantive conversation, and social support for achievement.

In order for students to extract as much learning as possible from such a rich learning experience, educators and instructional designers need to leverage experiential learning theory. One of the core tenets of experiential learning is “learning results from synergetic transactions between the person and the environment” (Kolb and Kolb 2009, p. 44). The transformation of experience into knowledge is achieved by the process of stepping through four learning modes known as the experiential learning cycle (Kolb and Kolb 2009). Moreover, feedback is an essential element of experiential learning, as it helps learners move through the four learning modes in the experiential learning cycle (Yeo and Marquardt 2015).

However, it is a complex undertaking to step each student through the four learning modes at key learning moments, while simultaneously facilitating feedback from industry sponsors, peers and educators in order to inject valuable insight into students’ knowledge extraction process. The complexity makes this type of learning experience costly to deliver. Even at low scale, educational providers must expend significant resources to identify key learning moments and address issues like team conflicts, decreases in confidence, learner disengagement, mismatched expectations, or inefficient collaboration between faculty, students, and industry participants. Thus, the administrative overhead to provide adequate student support and maintain quality at large scale, while simultaneously satisfying industry partner needs, can be cost-prohibitive for many education providers.

In this paper, we explore the challenges of delivering Team Projects, and how these challenges can be addressed by the effective use of the emerging technologies of the 4th industrial revolution (Almeida and Simoes 2019). Specifically, we focus on the use of data analytics and machine learning in the form of real-time learning analytics to augment facilitator instruction (James 2020). We present “Virtual Business Projects” (VBP), a technology-enabled delivery model that is designed to provide students with an opportunity to apply theoretical knowledge about technical skills and twenty-first Century competencies in a real-world context. The use of technology to structure and automate operational tasks lowers the overhead, and therefore the cost, of providing high-quality, authentic, experiential learning in online and blended learning environments. VBP is a scalable form of team project where teams of students complete a business project for an industry sponsor, aided by instructional technology. Finally, we argue that although emerging technologies like machine learning and learning analytics have the potential to augment educators’ effectiveness, this potential can only be realized if technology, pedagogy, and course development are collaborative and integrated (Lockyer and Dawson 2012). The VBP model was developed through collaborative technology, pedagogy, and course design, and has been provided at scale to more than 6000 online and distance education students.

The Complexities of Team Projects

A Team Project is a type of authentic and experiential learning that provides an opportunity for university students to apply the skills and competencies they are learning in the classroom to a real-world industry project. This type of learning combines key aspects of collaborative projects and internships, two high impact practices that have been widely tested and endorsed for their ability to promote active learning (Thorington Springer et al. 2019). It can also provide access to some of the benefits of an internship to students who are not able to take advantage of full internship or co-op opportunities; for example, adult, non-traditional students who are already working while in school, often in lower-level jobs. Industry sponsors provide the projects, bringing real-world context that adds authenticity and richness to the learning experience. Students get the opportunity to gain insight into different industries, get professional feedback on their work, develop a professional network, and produce artifacts that can be used in recruitment settings (Burns and Chopra 2017). Using this type of learning in the curriculum is not a new concept (McCubbins et al. 2016; National Research Council 2012). However, the complex delivery mechanics, time investment, and consequent cost of delivery have meant they are predominantly used as premium learning experiences for advanced students (Beckem and Watkins 2012).

The richness of a team project experience increases the complexity of the work students must do. Rather than merely explaining ideas and concepts as they might do in an essay, students have to apply these ideas and concepts to a new situation, which increases the level of cognitive ability required for success (Irvine 2017). The ability to take knowledge and information acquired in one context and apply it to another is called transfer, and is influenced by a student’s ability to connect past learning with current and future contexts (Jackson et al. 2018). There is little agreement on the nature of transfer in the academic literature, let alone how it should be examined or measured. The cognitive perspective of transfer examines the learner’s “mental representations of relations among objects” (Lobato 2012, p. 233). By contrast, the situative perspective examines the learner’s interaction with other individuals and the environment (Greeno 1997, 2006). Despite the debate about the nature of transfer, how it works, and how it is examined (Lobato 2012), in authentic and experiential learning programs students’ success is reliant on their ability to transfer theoretical knowledge from the classroom to real-life situations.

In fact, in a Team Project, students are not only transferring theoretical knowledge into actions, they are transferring theoretical concepts acquired individually in a classroom context to a project in a real-life business context. The National Research Council (2015) asserts that applying knowledge learned in the classroom to career performance is far transfer, a greater cognitive leap than applying knowledge to a classroom activity or writing a theoretical paper for academic assessment. Applying Barnett and Ceci’s (2002) taxonomy of transfer, students participating in a Team Project are transferring across knowledge domains, physical context, social context, functional context, and modality all at once.

Furthermore, Team Project success often requires aggregation of new information with students’ existing knowledge. Aggregation of new knowledge to one’s existing knowledge base and transfer of knowledge to a new context are each complex mental processes individually. This complexity is multiplied when both mental processes need to happen in parallel.

In addition to this complex mental process, students are working in a team. The ability to work in teams and collaborate effectively is a highly sought-after competency for employers (Ritter et al. 2017), which has resulted in team assignments and projects being increasingly prevalent in higher education curricula (Borstler and Hilborn 2016; Burrell et al. 2015; Hobson et al. 2014). Like any technical skill, teamwork is built through practice and intentional development. However, team assignments and projects are often introduced into the curriculum without effective instruction (Lingard 2010). Embedding teamwork into the curriculum without adequate support and theoretical frameworks to aid understanding of teamwork can result in students developing bad teamwork habits that could have a negative impact on their success as they move through their careers. Depending on academic level and life experience, each student on a team could be anywhere from novice to expert on the Dreyfus Model of skill acquisition (Glover et al. 2018) for each of the skills required to work effectively in a team and to complete a project successfully.

Faculty must meet the challenge of supporting these complex student learning needs while managing the additional layer of complexity created by the introduction of an industry sponsor into the learning collaboration (Lawson et al. 2011). Before the start of the course, industry sponsors need to be recruited, and potential projects evaluated to ensure they are aligned with learning outcomes and students’ experience level. Once the course starts, faculty need to monitor industry sponsors to ensure they are getting good outcomes. Simultaneously, faculty need to ensure that student teams get the information, feedback, and support they need from the industry sponsor to effectively deliver those outcomes.

Despite technology being a key to scale, the complexity of Team Projects is magnified when implemented in the context of distance and online education. Students have the additional challenge of managing their team collaboration and project delivery virtually, and in some cases, asynchronously due to time zone challenges.

The Potential of Experiential Learning Technology to Enable Quality at Scale

In order to deliver Team Projects at a scale that makes them cost-effective while maintaining the quality of the learning experience, the challenges mentioned above need to be overcome, and use of emerging technologies may be a way to do it. Emerging technologies like machine learning and real-time learning analytics hold the potential to support students at different levels, monitor reflective activities in order to highlight learning challenges, and automate the operational tasks required to leverage the use of feedback in instruction and to monitor engagement. This ability to automate operational tasks and provide data and insight to faculty enables faculty to provide more tailored support to students (James et al. 2018). Crawley et al. (2009) identify that the transition from face-to-face learning to online or technology-enabled learning results in a loss of affective cues used to intuit learner needs. Research that integrates learning analytics and learning science indicates that real-time learning analytics enabled by machine learning could offer a replacement for affective cues lost in technology-enabled learning (James 2020).

In order to successfully enable the delivery of Team Projects at scale without compromising on quality outcomes, a technology tool should be built with authentic learning and experiential learning theory as core tenets of the design. Furthermore, the pedagogical underpinnings of the theory need to be reflected in the design’s features and functionality. Specifically, a desirable technology solution needs to support the scaffolding of learning content and instructor support, reflection exercises, industry and peer feedback, and participant engagement.

Scaffolding of Learning Content and Instructor Support

Student participants extract valuable learning from experience by performing three main tasks:

  1. applying domain knowledge to a real-world project,

  2. managing team collaboration,

  3. demonstrating effective project and industry sponsor management.

All of these tasks are complex cognitive processes that are best learned through experience (Nenzhelele and Pellissier 2014; Proctor and Van Zandt 2018; Wilton 2011). However, learning all of them in parallel is a challenge and likewise, supporting students through the process is a challenge for faculty, especially with individual students at different proficiency levels for any given skill at any given point in time. Learning analytics explore and make meaning out of data points in a way that enhances understanding of the learner, their learning process, and the learners they are collaborating with (Baker and Inventado 2014; Dawson et al. 2016; Long et al. 2011; Siemens 2013). These insights can be used to decide when elements of support scaffolded into the instructional design can be removed for individual students. This technology-enabled process can augment the faculty’s ability to accelerate advanced students and provide more support for those who need it.

Reflection Exercises

One of the learning modes in the experiential learning cycle is reflective observation. Reflective observation focuses on making meaning out of the current or past experience. It is followed by an abstract mental process of combining reflective observations with a student’s existing knowledge base, in order to generate new knowledge or greater understanding. This new knowledge or understanding is then applied through active experimentation (Kolb and Kolb 2009). A learner can extract learning from experience; however, stepping through all four phases of the experiential learning cycle optimizes the potential for knowledge extraction and new understanding.

Instructional technology can be used to trigger reflective practice at moments when there is likely to be a problem of practice that might lead to valuable knowledge extraction. Additionally, the instructional technology can be used to scaffold the reflective writing tasks, initially stepping students through each phase of the experiential learning cycle in isolation, then leaving the task more freeform as students develop the capability to step through the process on their own. Real-time learning analytics processes enabled by machine learning can analyze reflective writing in real-time to provide students with insights on their reflective process and how to develop it (Buckingham Shum et al. 2017).

Industry and Peer Feedback

Reflection and reflective assignments are prevalent in higher education (Wong 2016). However, scholars claim that effective reflective practice requires the ability to identify that a problem exists (Newman 2018). Without industry and peer feedback, meaningful reflections are a difficult proposition for students who have limited reference points from professional practice and are often novices in the skill they are attempting to develop.

Feedback is a core instructional tool used to help students develop their knowledge, skills, and abilities (Brooks et al. 2019). In a traditional learning environment, feedback is predominantly given by faculty and received by students. All learning management systems support this traditional feedback loop. However, in a Team Project, the most valuable feedback is provided by industry sponsors who can provide industry insights, and by peers who can help each other with learning twenty-first Century Skills. Moreover, the ability to give and use feedback is a valuable professional skill (Donia et al. 2018). Nevertheless, facilitation of peer feedback and industry feedback can be labor intensive for faculty, particularly as teacher-student ratios increase (Bailey and Garner 2010). The labor intensity of administering feedback loops results in the giving and receiving of feedback being relegated to an informal part of the learning experience, which ignores some of the core value propositions of industry engagement and team projects.

Instructional technology can structure and automate industry sponsor and peer feedback in the course design, making feedback more accessible and scalable as an instructional tool. Moreover, text mining and sentiment analysis can be used to highlight negative feedback that helps faculty to identify when industry partners or students are unhappy, or when there is team conflict. The real-time identification of these issues can enable faculty to intervene and provide support on the individual and team level at the right moment, rather than finding out about issues retrospectively through post-program surveys or interviews.
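The article does not describe how the platform's sentiment analysis is implemented, but the flag-by-exception idea can be illustrated with a minimal, purely hypothetical lexicon-based sketch: score each piece of written feedback and surface the negative items for faculty review. The word lists and threshold here are illustrative assumptions, not the platform's actual method.

```python
# Hypothetical sketch: flag feedback items whose lexicon-based sentiment
# score falls below a threshold, so faculty review by exception.
NEGATIVE = {"late", "confusing", "unresponsive", "frustrated", "conflict", "poor"}
POSITIVE = {"clear", "helpful", "responsive", "great", "thorough", "professional"}

def sentiment_score(text):
    """Crude sentiment: (positive hits - negative hits) / word count."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return hits / len(words)

def flag_for_review(feedback_items, threshold=0.0):
    """Return the feedback items faculty should look at first."""
    return [f for f in feedback_items if sentiment_score(f["text"]) < threshold]

items = [
    {"team": "A", "text": "Clear plan and responsive team, great progress."},
    {"team": "B", "text": "Deliverable was late and the report was confusing."},
]
print([f["team"] for f in flag_for_review(items)])  # → ['B']
```

A production system would of course use a trained model rather than a word list, but the faculty-facing workflow is the same: only flagged items demand attention.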

Participant Engagement

Learner engagement is a lead indicator of learner success (Groccia 2018), and is defined as “students’ cognitive investment in, active participation with, and emotional commitment to learning particular content” (Bender 2017, p. 2). Driving engagement is particularly meaningful in online and distance education, where interactions and communication between students and teachers are reduced (Lee et al. 2019). Literature provides examples of pedagogical tools that can be used for engagement, including experiential and project-based learning themselves (Li et al. 2017). However, in order for them to have the desired effect, they need to be well designed. As the definition highlights, engagement is driven at three levels: action, emotion, and cognition. Instructional technology could support engagement at all three levels in various ways.

For action-based engagement, instructional technology should support gamification mechanics to provide students and industry sponsors with extrinsic motivation triggers for the desired behavior. Use of completion tracking, badges, and points can drive active engagement. On the deeper emotional and cognitive level, real-time learning analytics can be used to give faculty insight into emotional connectedness and confidence in the project. The learning analytics allow faculty to intervene in the learning experience in real-time to resolve issues that may be causing students to lose confidence or disengage from the experience altogether.

A Successful Example: Showcasing ‘Virtual Business Projects’

A Virtual Business Project (VBP) is a model of team project that is underpinned by Kolb’s experiential learning theory, integrates pedagogy, curriculum and technology, and is delivered digitally. In a VBP, students work in teams to deliver a real-world project to an industry sponsor. It is a form of capstone project designed specifically for online and technology-enabled learning.

In the mobilization phase of a VBP, faculty and program managers collaborate with industry partners to design projects that meet industry sponsor objectives while also aligning with the intended learning outcomes, students’ ability level, and the amount of time students have to work on the project. Once the program kicks off, teams of students are assigned to an industry project and participate in an online briefing meeting to gain an understanding of the project and industry sponsor expectations, and to build team connectedness and project confidence. After the briefing meeting, students develop a project plan that re-defines the project, highlighting team roles, deadlines, and meetings with the industry sponsor. Once the project plan is submitted to the industry sponsor for feedback and approval, student teams execute the project, submitting it for feedback to the industry sponsor multiple times throughout the process to gain insight, build buy-in, and ensure the project is aligned to expectations. These industry feedback loops form an invaluable part of the experience, as they help students identify problems, which in turn leads to more effective reflection exercises. In the final stages of the project, student teams develop a project report that is delivered to the industry sponsor in an online presentation.

In parallel with the project delivery process, students work collaboratively to better understand and develop their teamwork skills. Before starting the project, they evaluate themselves on a list of teamwork skills, and identify how they plan to develop their weaker skills in the first phase of the project. After each project submission, student teams complete a follow-up self-evaluation and a peer evaluation of each of their team members. Once this process is complete, students receive an automated report that compares their self-assessment with peer assessments and aggregates constructive feedback from their peers. The students review this feedback, using it to identify which skills they plan to focus on developing in the next phase of the project and how they are going to develop them.

Due to synergy between the learning design and purpose-built technology, a small team of program administrators was able to deliver high-quality VBPs to more than 6000 students in the last two and a half years. In this showcase, we want to demonstrate how the Virtual Business Project model leverages Practera, an emerging experiential learning technology platform, to scaffold learning content, support reflective learning, automate feedback, and drive student and industry sponsor engagement.

How Integrating Technology into the Instructional Design Supported High-Quality VBP at Scale

In all VBPs, learners are provided with an app that guides them through the learning program and recommends the right actions at critical learning moments. The learning design includes built-in points for students to engage in reflection, which are triggered by Practera’s ‘pulse check’ functionality and feedback from industry sponsors or peers. Students then capture their intended improvement points in a development plan that is accessible by faculty. All of the feedback, reflection triggers, and development plan processes are embedded in the course design. The technology sequences and automates the operational tasks required to extract and distribute project work, feedback, and development plans. Leveraging the technology to automate the process frees faculty from the operational tasks and gives them the data they need to understand each student’s progress, so that they can provide tailored support.

Instructors are supported by having access to real-time learning analytics and an automated experiential learning support assistant to enable meaningful facilitation, even though they are not physically engaging with students throughout the course. The technology platform leverages insights from learning analytics research, processing the data in real-time and displaying it in a way that helps faculty identify students who need support in order to improve their performance (James et al. 2018).

VBPs’ design highlights potential learning moments where students and teams may benefit from faculty support. Once students are in the VBP program, the technology analyzes individual, team, and industry sponsor data points to identify when students are reaching these learning moments; it then notifies faculty. For example, faculty are notified when students are exhibiting a negative sentiment towards the project, student teams are experiencing dissonance, or industry sponsors are disengaged. Once a learning moment is identified, faculty can use detailed insights presented about each student, team, or sponsor to personalize recommendations and support (Pardo et al. 2017). After faculty action, the intervention is logged to measure its efficacy and impact on the people involved.

Scaffolding of Learning Content and Instructor Support

The learning design of a Virtual Business Project allows for customizable scaffolding and instructor support to give students the right amount of support at the right time to perform their three essential tasks.

  1. applying domain knowledge to a real-world project

VBPs require students to transfer information from the theoretical content of the course, aggregate this new information with their existing knowledge, and apply it to the real-world context of the VBP. Individual students in the class will have different existing knowledge bases.

Faculty can use the technology to control the hiding and unhiding of supportive learning content on an individual, team, or cohort level. They can decide to hide or reveal content based on their evaluation of students’ ability and on feedback from industry sponsors and peers. Furthermore, they can transfer the decision to the students themselves, enabling students to take more control of their VBP program. The technology allows faculty to provide the right learning content at the right time in the project, so that students can effectively transfer theoretical knowledge to the project context in real-time.
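The visibility controls described above could be modeled in many ways; as one hypothetical sketch (the class, rule levels, and precedence order below are illustrative assumptions, not Practera's data model), content visibility can be resolved per student with the most specific rule winning:

```python
# Hypothetical sketch of per-level content visibility: a piece of supportive
# learning content can be hidden or revealed for a whole cohort, a team, or
# a single student, with the most specific rule taking precedence.
class ContentVisibility:
    def __init__(self, default_visible=True):
        self.default = default_visible
        self.rules = {}  # (level, id) -> bool, e.g. ("team", "T1") -> False

    def set_rule(self, level, target_id, visible):
        self.rules[(level, target_id)] = visible

    def is_visible(self, student_id, team_id, cohort_id):
        # Most specific rule wins: student > team > cohort > default.
        for level, target in (("student", student_id),
                              ("team", team_id),
                              ("cohort", cohort_id)):
            if (level, target) in self.rules:
                return self.rules[(level, target)]
        return self.default

hints = ContentVisibility(default_visible=False)
hints.set_rule("cohort", "2020-spring", True)   # reveal hints to the cohort...
hints.set_rule("student", "s42", False)         # ...but hide from an advanced student
print(hints.is_visible("s42", "T1", "2020-spring"))  # → False
print(hints.is_visible("s7", "T1", "2020-spring"))   # → True
```

The precedence order is what makes the feature useful pedagogically: a faculty member can set a cohort-wide default and still accelerate an individual student by overriding it.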

  2. managing team collaboration

The VBP model has a teamwork skill development process integrated into the design. Students complete a teamwork skill self-assessment and development plan before the project starts. After each project submission, students redo their self-assessment, complete a peer assessment on the same teamwork skills, and update their development plan based on evaluation and feedback from their peers. The teamwork skill development process itself provides a structure that helps students better understand how their peers receive their efforts to collaborate and work effectively in a team. Additionally, it provides a platform to discuss differences in expectations using a common language.

Furthermore, students and industry sponsors complete a micro-reflection called a ‘pulse check’ after each project submission. A pulse check is a series of three questions that students and industry sponsors have to answer before progressing into the next phase of the project. Using real-time learning analytics, Practera aggregates each team’s pulse check to unearth the team’s overall health when it comes to collaboration and project confidence. Together, the ‘pulse check’ process and the self- and peer-assessment process provide data that faculty can use to gain insight about the cohesiveness of the team, their project confidence, and engagement levels. Faculty can use this data to further support teams’ teamwork and collaboration skill development in real-time.
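The aggregation step can be pictured with a small sketch. The actual pulse-check questions and rating scales are not specified in this article, so the 1–5 scale, metric names, and alert threshold below are illustrative assumptions:

```python
from statistics import mean

# Hypothetical sketch: aggregate individual pulse-check ratings (assumed
# 1-5 scale) into a team-health summary, and raise an alert when a metric's
# team average dips below a threshold.
def team_health(pulse_checks, alert_below=3.0):
    summary = {}
    for metric in ("collaboration", "confidence"):
        summary[metric] = mean(pc[metric] for pc in pulse_checks)
    summary["alerts"] = [m for m in ("collaboration", "confidence")
                         if summary[m] < alert_below]
    return summary

checks = [
    {"student": "s1", "collaboration": 4, "confidence": 2},
    {"student": "s2", "collaboration": 4, "confidence": 3},
    {"student": "s3", "collaboration": 5, "confidence": 2},
]
print(team_health(checks)["alerts"])  # → ['confidence']
```

In this toy data the team collaborates well (average 4.3) but its project confidence has slipped (average 2.3), which is exactly the kind of dip the article suggests faculty should be alerted to in real-time.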

  3. demonstrating effective project and sponsor management

In addition to effective collaboration within their project team, students working on VBPs need to meet project deadlines, keep industry sponsors up to date with the project, and manage sponsor expectations. Students who already have professional experience may be competent, or even experts, in project management and client management. However, for students without previous professional experience, especially those early in their degree programs, these are professional skills they can develop as part of the project experience.

In a VBP, sponsor management and project management are included as learning objectives and designed into the assessment structure and learning content. The entire project, with its associated learning content, is structured using project management principles. In addition to learning content about the technical skills required to complete the project, students gain access to just-in-time learning content on how to effectively manage the project and sponsor expectations at that particular phase of the project. The learning content, delivered by industry professionals, suggests tasks that may facilitate more effective project or sponsor management, and explains why these particular tasks will likely lead to a better overall project outcome.

Building project and sponsor management into the learning objectives and the instructional design provides students with the opportunity to learn these skills experientially while working on the project. In addition, the provision of structured, front-loaded learning content and tips enables them to leverage the experience of others to start at a higher level of proficiency. This structuring of learning content into the project delivery means that students can get the benefit of the industry engaged learning earlier in their academic program, without risk of developing a bad professional reputation.

The benefits of building project management and sponsor management into the learning objectives extend beyond students to the faculty and industry sponsor. When student teams can effectively manage their project, and engage their sponsor through timely communication and reporting, it is easier for the industry sponsor to stay aware of how the project is progressing. This communication channel enables industry partners to provide expert insight that will help students complete more valuable work and ensure the project is staying aligned to the project brief.

Reflection Exercises

VBPs include micro-reflections on team progress and project confidence, as well as written (or video) reflections on teamwork skill development. Micro-reflection questions are triggered when students submit project deliverables, and again after they review industry sponsor feedback on the deliverables. These automated micro-reflection questions are designed to facilitate a transfer from the concrete experience to the reflective observation mode of experiential learning. More extensive written reflection in the form of skill development plans is used for teamwork skill development. After each project submission, students re-evaluate their teamwork skills, reflect on how their implementation of their skill development plan went, and plan their skill development for the next phase of the project.

In the first phase of the project, the reflection exercise is highly scaffolded, explicitly stepping students through reflective observation, abstract conceptualization, and planning for active experimentation. As the project progresses, faculty can remove the scaffolding on an individual, team, or cohort basis, depending on their students’ needs.

Throughout a three-week VBP, each student completes six micro-reflections, two skill development plans, and one post-project reflection. Additionally, industry sponsors complete a micro-reflection each time they provide feedback, three in total. Data from all of these micro-reflections and skill development plans are analyzed by the instructional technology to identify team dissonance, low project confidence, and other collaboration issues. Faculty can use the insights from this analysis to provide tailored support to individual students, teams, and industry sponsors.
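The reflection cadence above follows directly from the three submission milestones. A minimal sketch of the arithmetic, assuming a three-week VBP with three submissions (function and variable names are illustrative, not part of the actual Practera platform):

```python
# Illustrative sketch of the VBP reflection cadence described above.
# Assumes three submission milestones (project plan, draft report,
# final report); names are hypothetical, not Practera's API.

SUBMISSIONS = 3

def student_reflection_counts(submissions: int = SUBMISSIONS) -> dict:
    """Count the reflection artifacts one student completes in a VBP."""
    return {
        # one micro-reflection at submission, one after reviewing feedback
        "micro_reflections": submissions * 2,
        # a skill development plan is redrawn after each non-final submission
        "skill_development_plans": submissions - 1,
        "post_project_reflections": 1,
    }

def sponsor_reflection_count(submissions: int = SUBMISSIONS) -> int:
    """Industry sponsors complete one micro-reflection per feedback round."""
    return submissions
```

With the default of three submissions, this reproduces the counts in the text: six micro-reflections and two skill development plans per student, plus one post-project reflection, and three sponsor micro-reflections.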

Industry and Peer Feedback

In a three-week VBP, student teams submit a project plan, a draft project report, and a final project report for feedback from industry sponsors. Following submission, industry sponsors are notified, review the deliverable, and respond with both rubric ratings and written feedback. After industry sponsors finalize their review, student teams are notified; they then review the feedback, rate its usefulness, and thank their industry sponsor for the insight. Additionally, after each project submission, students complete a peer review process in which they provide both ratings and written feedback on their team members’ teamwork skills.

Throughout a VBP, a team of five students and one industry sponsor generates three sponsor-to-team feedback loops, 15 student-to-sponsor feedback loops, and 60 peer-to-peer feedback loops. All of these feedback loops are automated and analyzed by the instructional technology. This analysis allows faculty to review potentially problematic feedback by exception and invest their time in providing additional support, instead of operationally executing the feedback loops themselves.
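The feedback-loop totals above can be verified with a short sketch, assuming one sponsor per team, a five-student team, and three submission rounds (the function name and return keys are illustrative):

```python
def feedback_loop_counts(team_size: int = 5, rounds: int = 3) -> dict:
    """Count the feedback loops generated by one VBP team.

    Illustrative sketch only. Assumes one sponsor per team and that each
    submission round triggers: one sponsor-to-team review, one
    feedback-usefulness rating per student back to the sponsor, and a
    full peer review in which every student rates every teammate.
    """
    return {
        "sponsor_to_team": rounds,
        "student_to_sponsor": team_size * rounds,
        "peer_to_peer": team_size * (team_size - 1) * rounds,
    }
```

The peer-to-peer term grows quadratically with team size, which is why automating these loops matters most for large cohorts.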

The automation of feedback by the technology allows for larger volumes of formative feedback in large cohorts of students, without a significant increase in operational time investment for faculty. The formative feedback can then be used as an instructional tool to help students develop their skills (Brooks et al. 2019), while also providing multiple opportunities for students to develop their ability to give and receive feedback (Donia et al. 2018). Additionally, faculty can use the students’ helpfulness ratings to improve industry feedback rubrics and perhaps even select appropriate industry sponsors for different cohorts of students.

Participant Engagement

Engagement is a prerequisite for extracting knowledge from a learning experience (Groccia 2018), and is driven by action, emotion, and cognition. VBPs drive active student engagement using game mechanics, including achievement badges and progress bars. Badges and their unlocking conditions are set up during the design of the learning experience. To encourage good learning behaviors, the unlocking conditions can be based on completion of critical tasks, such as project submissions, or on completion of essential learning content.

The emotional and cognitive engagement mechanics of the VBP model are still in their infancy and are primarily driven by the data captured from the action-based engagement mechanics, reflection exercises, and feedback loops. Faculty can use the indicators of team dissonance and project confidence to provide timely support interventions to students, teams, and industry sponsors. These interventions are tracked in the instructional technology, enabling educators to review all of their support interventions at the end of the VBP and reflect on which ones affected student engagement and learning.
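One simple way reflection data could surface team dissonance is to flag teams whose members report widely divergent project confidence. This is purely an illustrative sketch under assumed inputs (a list of 1–5 Likert ratings), not the platform's actual analysis:

```python
from statistics import pstdev

def flag_team_dissonance(confidence_ratings: list[float],
                         threshold: float = 1.5) -> bool:
    """Flag a team whose members report widely divergent project
    confidence (e.g. on a 1-5 Likert scale). The spread threshold is
    an arbitrary illustrative choice, not the platform's actual rule."""
    return pstdev(confidence_ratings) > threshold

# A team split between very confident and very unconfident members
# warrants a faculty check-in; a uniformly confident team does not.
```

In practice, an indicator like this would only prioritize faculty attention; the support intervention itself remains a human judgment.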


As this showcase of the Virtual Business Project model demonstrates, a technology-enabled delivery model now exists that enables education providers to offer authentic, experiential learning at a level of quality that was previously cost-prohibitive at scale. Emerging technologies such as machine learning and real-time learning analytics hold the potential to support scaffolding of learning, monitor reflective activities, and automate the operational tasks required to better leverage feedback and participant engagement in online authentic and experiential learning. Moreover, the VBP model enables cost-efficient scale by systematically embedding team projects into online and distance education programs.

To adopt emerging technology and experiential learning programs like VBP, and to advance the field of experiential learning, we recommend that faculty and instructional designers integrate technology into the pedagogical development of courses. We further recommend that they use the Virtual Business Project model earlier in degree programs to develop students’ ability to extract learning from experience, and that they collaborate with learning analytics researchers to further improve the capabilities of real-time learning analytics for supporting experiential learning.

Integrate Emerging Technology with Pedagogy

Emerging technologies provide the most value when they are explicitly integrated into the design of the learning experience. Faculty and instructional designers should embrace new technology and collaborate with engineers and data scientists, providing consistent feedback to help develop instructional technology that enables active and authentic learning at scale. Research suggests that these collaborations enable data-driven curriculum redevelopment (Lockyer and Dawson 2012). However, learning analytics must be embedded into the course design in order to collect appropriate, accurate, and useful data (Kovanović et al. 2017).

The Virtual Business Project model has been, and continues to be, iteratively developed by a consortium of learning designers, faculty, engineers, and learning analytics researchers. Its development has held in tension the quality of the learning experience for individual students and the need for scalability. Throughout the various iterations of the VBP model, both the pedagogy and the instructional technology have been changed to maximize student learning. As experiential learning and online delivery both continue to grow in popularity, more models of experiential learning need to be purpose-designed for the online paradigm.

Use the VBP Model to Develop Transfer and Knowledge Extraction Capability

The Virtual Business Project model is designed to support the transfer of theoretical knowledge to a real-life business project in three ways: by decreasing the complexity of far transfer along the temporal, functional, and knowledge domains of transfer (Barnett and Ceci 2002); by stepping students through the experiential learning cycle to maximize their knowledge extraction; and by facilitating the development of teamwork, project management, and sponsor management skills through structured peer and sponsor feedback.

Implementing VBPs early in an undergraduate or graduate degree allows students to develop these capabilities at the beginning of their program. Opportunities to develop the ability to transfer knowledge and extract learning from experience will help students draw more learning out of other experiential learning programs during their degree, such as co-ops, internships, or capstone projects. Moreover, developing effective teamwork, project management, and sponsor management skills could better position students to turn industry-engaged learning experiences into career opportunities.

Collaborate with Learning Analytics Researchers

Learning analytics is an emerging field of educational technology research that leverages data sets from instructional technology to understand learning. The literature acknowledges a gap in learning analytics research underpinned by learning theory and learning science (Avella et al. 2016; Gašević et al. 2015; Kirkwood and Price 2014; Lodge and Corrin 2017; Lockyer et al. 2013; Reimann 2016). Effective collaboration between instructional designers, faculty, and learning analytics researchers can lead to more models of experiential learning that integrate technology into the pedagogy. Moreover, such efforts can provide insights to inform the development of instructional technology so that it can better support authentic and experiential learning in online and distance education.


  1. Ahuna, K. H., Tinnesz, C. G., & Kiener, M. (2014). A new era of critical thinking in professional programs. Transformative Dialogues: Teaching & Learning Journal, 7(3), 1–9.

  2. Almeida, F., & Simoes, J. (2019). The role of serious games, gamification and industry 4.0 tools in the education 4.0 paradigm. Contemporary Educational Technology, 10(2), 120–136.

  3. Andrade, M. S. (2016). Curricular elements for learner success—21st century skills. Journal of Education and Training Studies, 4(8), 143–149.

  4. Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning analytics methods, benefits and challenges in higher education: a systematic literature review. Online Learning, 20(2), 13–29.

  5. Bailey, R., & Garner, M. (2010). Is the feedback in higher education assessment worth the paper it is written on? Teachers’ reflections on their practices. Teaching in Higher Education, 15(2), 187–198.

  6. Baker, R. S., & Inventado, P. S. (2014). Educational data mining and learning analytics. In J. Larusson & B. White (Eds.), Learning analytics (pp. 61–75). Springer.

  7. Barnett, S. M., & Ceci, S. J. (2002). When and where do we apply what we learn? A taxonomy for far transfer. Psychological Bulletin, 128(4), 612–637.

  8. Beckem, J. M., & Watkins, M. (2012). Bringing life to learning: immersive experiential learning simulations for online and blended courses. Journal of Asynchronous Learning Networks, 16(5), 61–70.

  9. Bender, W. N. (2017). 20 strategies for increasing student engagement. Learning Sciences International.

  10. Bonciu, F. (2017). Evaluation of the impact of the 4th industrial revolution on the labour market. Romanian Economic and Business Review, 12(2), 7–16.

  11. Borstler, J., & Hilborn, T. B. (2016). Team projects in computing education. ACM Transactions on Computing Education, 16(2).

  12. Brooks, C., Carroll, A., Gillies, R. M., & Hattie, J. (2019). A matrix of feedback for learning. Australian Journal of Teacher Education, 44(4).

  13. Buckingham Shum, S., Sandor, A., Goldsmith, R., Bass, R., & McWilliams, M. (2017). Towards reflective writing analytics: rationale, methodology and preliminary results. Journal of Learning Analytics, 4(1), 58–84.

  14. Burns, C., & Chopra, S. (2017). A meta-analysis of the effect of industry engagement on student learning in undergraduate programs. Journal of Technology, Management, and Applied Engineering, 33(1), 1–20.

  15. Burrell, A. R., Cavanagh, M., Young, S., & Carter, H. (2015). Team-based curriculum design as an agent of change. Teaching in Higher Education, 20(8), 753–766.

  16. Case, D. M., & Hoot, C. (2019). Capstone as consulting. Journal of Computer Sciences in Colleges, 34(4).

  17. Crawley, F. E., Fewell, M. D., & Sugar, W. A. (2009). Researcher and researched: the phenomenology of change from face-to-face to online instruction. The Quarterly Review of Distance Education, 10, 165–176.

  18. Dawson, S., Drachsler, H., & Rose, C. P. (2016). LAK ‘16: Proceedings of the sixth international conference on learning analytics & knowledge. ACM.

  19. Djankov, S., & Saliola, F. (2019). The changing nature of work. Journal of International Affairs, 72(1), 57–74.

  20. Donia, M. B. L., O’Neill, T. A., & Brutus, S. (2018). The longitudinal effects of peer feedback in the development and transfer of student teamwork skills. Learning and Individual Differences, 61, 87–98.

  21. Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: learning analytics are about learning. Tech Trends, 59(1), 64–71.

  22. Glover, R., Hammond, N. B., Smith, J., & Guerra, D. (2018). Assessing peer leader skill acquisition and group dynamics in a first-year calculus course. International Journal for the Scholarship of Teaching and Learning, 12(1).

  23. Greeno, J. G. (1997). On claims that answer the wrong questions. Educational Researcher, 26(1), 5–17.

  24. Greeno, J. G. (2006). Authoritative, accountable positioning and connected, general knowing: progressive themes in understanding transfer. The Journal of the Learning Sciences, 15(4), 537–547.

  25. Groccia, J. E. (2018). What is student engagement? New Directions for Teaching & Learning, 2018(154), 11–20.

  26. Hobson, C. J., Struck, D., Griffin, A., Szostek, J., & Rominger, A. S. (2014). Teaching MBA students teamwork and team leadership skills: an empirical evaluation of a classroom education program. American Journal of Education, 7(3), 191–212.

  27. Irvine, J. (2017). A comparison of revised Bloom and Marzano’s new taxonomy of learning. Research in Higher Education Journal, 33.

  28. Jackson, D., Fleming, J., & Rowe, A. (2018). Student transfer of skills and knowledge across university and work contexts. In K. E. Zegwaard & M. Ford (Eds.), Refereed Proceedings of the 3rd International Research Symposium on Cooperative and Work-Integrated Education, Stuttgart, Germany (pp. 65–72). World Association of Cooperative Education.

  29. James, N. (2020). Technology-enabled categorisation of learners for improved support in experiential learning [Unpublished doctoral dissertation]. University of Liverpool.

  30. James, N., Kovanovic, V., Marshall, R., Joksimovic, S., & Pardo, A. (2018). Examining the value of learning analytics for supporting work-integrated learning. In Australian Collaborative Education Network Conference, Sydney, Australia (pp. 55–61). Australian Collaborative Education Network Limited.

  31. Karakaş-Özür, N., & Duman, N. (2019). The trends in authentic learning studies and the role of authentic learning in geography education. International Education Studies, 12(12), 28–42.

  32. Kirkwood, A., & Price, L. (2014). Technology-enhanced learning and teaching in higher education: what is ‘enhanced’ and how do we know? A critical literature review. Learning, Media and Technology, 39(1), 6–36.

  33. Kolb, A., & Kolb, D. (2009). Experiential learning theory: A dynamic, holistic approach to management learning, education and development. In S. J. Armstrong & C. V. Fukami (Eds.), The SAGE handbook of management learning, education and development (pp. 42–68). SAGE Publications Ltd..

  34. Kovanović, V., Joksimović, S., Gašević, D., & Siemens, G. (2017). Digital learning design framework for social learning spaces. In Y. Bergner, C. Lang, G. Gray, S. D. Teasley, & J. C. Stamper (Eds.), Joint Proceedings of the Workshop on Methodology in Learning Analytics (MLA) and the Workshop on Building the Learning Analytics Curriculum (BLAC), Vancouver, Canada. Society of Learning Analytics Research.

  35. Lawson, R., Fallshaw, E., Papadopoulos, T., Taylor, T., & Zanko, M. (2011). Professional learning in the business curriculum: engaging industry, academics and students. Asian Social Science, 7(4), 61–68.

  36. Lee, J., Song, H. D., & Hong, A. H. (2019). Exploring factors and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11.

  37. Li, H., Ochsner, A., & Hall, W. (2017). Application of experiential learning to improve student engagement and experience in a mechanical engineering course. European Journal of Engineering Education, 44(3), 283–293.

  38. Lingard, R. W. (2010). Teaching and assessing teamwork skills in engineering and computer science. Systemics, Cybernetics and Informatics, 8(1), 34–37.

  39. Lobato, J. (2012). The actor-oriented transfer perspective and its contributions to educational research and practice. Educational Psychologist, 47(3), 232–247.

  40. Lock, J., & Duggleby, S. (2017). Authentic learning in the social studies classroom: connecting globally. One World in Dialogue, 4(1), 20–27.

  41. Lockyer, L., & Dawson, S. (2012). Where learning analytics meets learning design. In S. Dawson & C. Haythornthwaite (Eds.), Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (pp. 14–15). New York: Association for Computing Machinery.

  42. Lockyer, L., Heathcote, E., & Dawson, S. (2013). Aligning learning analytics with learning design. The American Behavioral Scientist, 57(10), 1439–1459.

  43. Lodge, J. M., & Corrin, L. (2017). What data and analytics can and do say about effective learning. Science of Learning, 2(5).

  44. Long, P., Siemens, G., Conole, G., & Gašević, D. (Eds.). (2011). LAK ‘11: Proceedings of the 1st international conference on learning analytics and knowledge. New York: ACM.

  45. Luo, T., Murray, A., & Crompton, H. (2017). Designing authentic learning activities to train pre-service teachers about teaching online. The International Review of Research in Open and Distributed Learning, 18(7).

  46. Mattarocci, S., & Ball, D.R. (2018). Empowering youths and combating gang activity across long island through MBA capstone consulting [Paper Presentation]. In 2018 Conference Proceedings Northeast Business and Economics Association, Galloway, New Jersey (pp. 196–199). Northeast Business and Economics Association.

  47. McCubbins, O. P., Paulsen, T. H., & Anderson, R. G. (2016). Student perceptions concerning their experience in a flipped undergraduate capstone course. Journal of Agricultural Education, 57(3), 70–86.

  48. Morgan, C. (2016). Testing students under cognitive capitalism: Knowledge production of twenty-first century skills. Journal of Education Policy, 31(6), 805–818.

  49. National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. The National Academies Press.

  50. National Research Council. (2015). Reaching students: What research says about effective instruction in undergraduate science and engineering. The National Academies Press.

  51. Nenzhelele, T. E., & Pellissier, R. (2014). Competitive intelligence implementation challenges of small and medium-sized enterprises. Mediterranean Journal of Social Sciences, 5(16).

  52. Newman, S. (2018). Philosophy and teacher education: A reinterpretation of Donald A. Schon’s epistemology of reflective practice. Routledge.

  53. Newmann, F. M., & Wehlage, G. G. (1993). Five standards of authentic instruction. Authentic Learning, 50(7), 8–12.

  54. Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2017). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128–138.

  55. Proctor, R. W., & Van Zandt, T. (2018). Human factors in simple and complex systems. Taylor & Francis.

  56. Reidenberg, S., & Long, S. (2017). Negotiating the client-based capstone experience. International Journal of Teaching and Learning in Higher Education, 29(3), 580–588.

  57. Reimann, P. (2016). Connecting learning analytics with learning research: the role of design-based research. Learning: Research and Practice, 2(2), 130–142.

  58. Ritter, B. A., Small, E. E., Mortimer, J. W., & Doll, J. L. (2017). Designing management curriculum for workplace readiness: developing students’ soft skills. Journal of Management Education, 42(1), 80–103.

  59. Siemens, G. (2013). Learning analytics: the emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.

  60. Snape, P. M. (2017). Enduring learning: integrating C21st soft skills through technology education. Design and Technology Education: An International Journal, 22(3), 1–13.

  61. Tasso, C., Aquino, E. D., & Robertson, R. W. (2017). Approach to higher education: workplace development in the 21st century. La Educación superior, El Desarrollo: Del Lugar, 6, 34–48.

  62. Thorington Springer, J., Hatcher, J., Rust, M., & Powell, A. A. (2019). Enhancing the quality of high-impact practices through taxonomies. Assessment Update, 31(2), 8–14.

  63. Wilton, N. (2011). Do employability skills really matter in the UK graduate labour market? The case of business and management graduates. Work, Employment and Society, 25(1), 85–100.

  64. Wolff, B. R., & Booth, M. (2016). Bridging the gap: Creating a new approach for assuring 21st century employability skills. Change: The Magazine of Higher Learning, 49(6), 51–54.

  65. Wong, A. C. K. (2016). Considering reflection from the student perspective in higher education. SAGE Open.

  66. World Economic Forum. (2015). New vision for education: Unlocking the potential of technology. World Economic Forum.

  67. Yeo, R. K., & Marquardt, M. J. (2015). (re) interpreting action, learning, and experience: integrating action learning and experiential learning for HRD. Human Resource Development Quarterly, 26(1), 81–107.


This work is supported in part by funding from the National Science Foundation under award #1725941. However, any opinions, findings, conclusions, and/or recommendations are those of the investigators and do not necessarily reflect the views of the Foundation.

Author information



Corresponding author

Correspondence to Nikki James.

Ethics declarations

Conflict of Interest

Authors N. James and P. Laufenberg are listed on a patent related to the Practera technology. Both authors own stock in Intersective Pty Ltd., which owns the Practera technology.

Ethical Statement

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit

About this article

Cite this article

James, N., Humez, A. & Laufenberg, P. Using Technology to Structure and Scaffold Real World Experiential Learning in Distance Education. TechTrends 64, 636–645 (2020).

Keywords


  • Experiential learning
  • Instructional design
  • Team based projects
  • Industry engagement
  • Higher education