Reflections on a massive open online life cycle assessment course
Masanet, E., Chang, Y., Yao, Y. et al. Int J Life Cycle Assess (2014) 19: 1901. doi:10.1007/s11367-014-0800-8
Purpose
This article summarizes student performance and survey data from a recent massive open online course (MOOC) on life cycle assessment (LCA). Its purpose is to shed light on student learning outcomes, challenges, and success factors, as well as on improvement opportunities for the MOOC and the role of online courses in LCA education in general.
Methods
Student survey data and course performance data were compiled, analyzed, and interpreted for 1257 students who completed a pre-course survey and 262 students who completed a post-course survey. Both surveys were designed to assess student learning outcomes, topical areas of difficulty, changing perceptions on the nature of LCA, and future plans after completing the MOOC.
Results and discussion
Results suggest that online courses can attract and motivate a large number of students and equip them with basic analytical skills to move on to more advanced LCA studies. However, results also highlight how MOOCs are not without structural limitations, especially related to mostly “locked in” content and the impracticality of directly supporting individual students, which can create challenges for teaching difficult topics and conveying important limitations of LCA in practice.
Conclusions
Online courses, and MOOCs in particular, may present an opportunity for the LCA community to efficiently recruit and train its next generations of LCA analysts and, in particular, those students who might not otherwise have an opportunity to take an LCA course. More surveys should be conducted by LCA instructors and researchers moving forward to enable scientific development and sharing of best practice teaching methods and materials.
Keywords: Education · Life cycle assessment · Massive open online course · Pedagogy
1 Introduction
Life cycle assessment (LCA) is used to support many types of environmental decisions, across a broad range of disciplines, and within nearly every economic sector. Despite LCA’s widespread use, the literature on pedagogical approaches in LCA and the experiences of LCA instructors and students is decidedly sparse (Cooper and Fava 2000a, b; Evans et al. 2008; Lin et al. 2012). As LCA practice grows, there is a pressing need for more research on the efficacy of different LCA teaching methods with respect to specific learning objectives and student characteristics. Such research could help the LCA community better educate and inspire its next generations of LCA analysts by facilitating scientific development and sharing of best practice teaching methods and materials.
This article takes a small step toward closing this research gap by summarizing student performance and survey data from a recent massive open online course (MOOC) on LCA (Masanet 2014). The course was designed to provide a basic yet broad overview of the LCA methodology to online students with elementary quantitative skills and from any discipline. Its primary aims were to introduce LCA to students who might not otherwise have access to an LCA course, to provide them with hands-on LCA experience, to inspire them to pursue more advanced LCA studies, and to equip them with a basic analytical foundation with which to do so.
Core topics included mass and energy balancing, unit process inventory construction and scaling, goal and scope definition, life cycle inventory (LCI) compilation, life cycle impact assessment (LCIA) (using TRACI 2.0 (Bare 2011)), cutoff criteria and multi-functionality approaches, and interpretation and reporting. These topics were taught with references to major LCA standards and guidelines—including the ISO 14040 series and the International Reference Life Cycle Data System (ILCD) Handbook (European Commission 2010)—to reinforce best practices. The 9-week course was hosted on the Coursera platform and ran from January to March 2014 (Coursera 2014). The MOOC consisted of 27 lecture videos (three per week), nine weekly homework assignments, in-video quizzes, a course project, and interactions between instructors and students via online discussion forums. The course project allowed for hands-on application of LCA to a simple and familiar product: a bottled soft drink. Students were required to build and interpret a complete LCA model of a bottled soft drink in a spreadsheet, using LCI data provided by the course instructors, and in a step-by-step fashion aligning with each week’s topical area and homework results.
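The calculation chain that students implemented in their spreadsheet models (scaling unit process inventories to the reference flow, compiling the LCI, and applying LCIA characterization factors) can be sketched in a few lines of code. The process names, flow amounts, and characterization factors below are hypothetical placeholders for illustration only; they are not course data, nor actual TRACI 2.0 factors.

```python
# Sketch of the spreadsheet-style LCA calculation chain taught in the MOOC:
# scale unit process inventories, sum them into a life cycle inventory (LCI),
# then apply characterization factors (LCIA). All names and numbers below
# are hypothetical illustrations, not data from the course or TRACI 2.0.

# Unit processes: flows per unit of process output (kg of emission per unit)
unit_processes = {
    "bottle_production": {"output": 1.0, "co2_kg": 0.08, "ch4_kg": 0.0002},
    "beverage_filling":  {"output": 1.0, "co2_kg": 0.03, "ch4_kg": 0.0001},
}

# Scaling: units of each process required per functional unit
# (e.g., one bottled soft drink delivered to the consumer)
scaling = {"bottle_production": 1.0, "beverage_filling": 1.0}

def compile_lci(unit_processes, scaling):
    """Scale each unit process to the reference flow and sum flows into an LCI."""
    lci = {}
    for name, process in unit_processes.items():
        factor = scaling[name] / process["output"]
        for flow, amount in process.items():
            if flow == "output":
                continue
            lci[flow] = lci.get(flow, 0.0) + factor * amount
    return lci

def characterize(lci, cfs):
    """LCIA step: multiply each inventory flow by its characterization factor."""
    return sum(cfs.get(flow, 0.0) * amount for flow, amount in lci.items())

# Hypothetical global warming characterization factors (kg CO2-eq per kg)
gwp_cfs = {"co2_kg": 1.0, "ch4_kg": 28.0}

lci = compile_lci(unit_processes, scaling)
gwp = characterize(lci, gwp_cfs)
print(f"GWP per functional unit: {gwp:.4f} kg CO2-eq")
```

As in the course project, each step mirrors one week's topical area: the scaling factors come from goal and scope definition, the summation is LCI compilation, and the final multiplication is the LCIA step.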
Over 17,000 students enrolled in the course. Of these, over 1200 viewed all lecture videos, over 2300 turned in one or more homework assignments, and over 700 ultimately passed the course. Such low retention and completion rates are typical of most MOOCs, in which many students enroll but far fewer engage with the course materials (Ho et al. 2014). Nearly 2500 students completed a voluntary pre-course survey, which gathered data on their backgrounds, skills, motivations, and expectations entering the course. Findings from the pre-course survey are summarized in Masanet and Chang (2014, in press). Of those who finished the course, 262 completed a voluntary post-course survey, which gathered data on their course experiences and future plans. These two voluntary surveys were not designed for rigorous statistical analysis, but rather to identify broad characteristics among the student body. This article combines pre- and post-course survey data with course performance data to shed light on student perspectives, challenges, and success factors, as well as improvement opportunities for the MOOC.
2 Retention and performance
One explanation for these performance gaps might be that the combination of a fast-paced course (9 weeks), weekly quantitative homework assignments, and building a comprehensive spreadsheet model proved too demanding for those with less confidence in their quantitative skills. To avoid discouraging promising students from improving their quantitative skills to learn LCA, future offerings of the MOOC will be longer in duration and will offer additional resources for practicing and improving key quantitative skills along the way. Additionally, of all quantitative skills, the performance gap between experienced and non-experienced students was most pronounced for mass and energy balancing, which was also the skill in which the fewest students indicated at least some experience. These results suggest a particularly strong need for improved explanations and training in the MOOC with respect to conducting mass and energy balances.
The data in Fig. 2 also point to some structural challenges of MOOCs in general. First, by nature, a MOOC’s content is largely “locked in” prior to the launch of the course, which makes it difficult to adapt content based on feedback from struggling students along the way, as can be done in a traditional classroom. For example, only one brief lecture was devoted to mass and energy balancing, which was probably not sufficient given the lack of experience among students who ultimately enrolled in the course. Second, it is impractical for course staff to answer all questions in MOOC discussion forums, given thousands of enrolled students. Therefore, many less experienced students may not have received the support they needed to grasp key concepts, as they normally would in a traditional classroom. Third, although the MOOC listed clear prerequisites for basic quantitative skills, there is no mechanism for enforcement (i.e., anyone can enroll). Thus, it is likely that some enrolled students simply did not have the proper quantitative foundations to succeed in the MOOC, despite the explicit prerequisites. Moving forward, a key challenge is to strike a balance between offering remedial quantitative training in the MOOC, to equip and encourage capable students to pursue more advanced LCA studies, and maintaining the minimum level of quantitative rigor that is required for sound and credible LCAs.
3 Topical difficulty ratings
The distributions within each topical area shed light on overall perceived difficulty. Goal definition was the only topic that was predominantly rated as easy, while the topics of multi-functionality, input-output (IO) LCI, LCIA, and preparing results for ease of interpretation were predominantly seen as more difficult. Distributions were nearly evenly balanced between easy and difficult ratings for the remaining topics (e.g., functional unit and reference flow selection).
Although teaching methods can always be improved for any topical area, the data in Fig. 3 suggest that such improvements should be prioritized for materials and methods related to the “difficult” topics in future offerings of the MOOC. While many LCA analysts would agree that multi-functionality, IO LCI, LCIA, and effective results preparation are complex and nuanced topics in practice, insufficient data exist in the literature to determine whether these topics are universally perceived as most difficult in LCA courses, or if these perceived difficulties can be attributed to the specific teaching methods and materials in the MOOC. Regardless, the MOOC’s structural challenges of limited time, limited direct engagement with many students, and mostly “locked in” course content likely contributed to greater perceived difficulties for these complex and nuanced topics. For example, due to time constraints, the MOOC focused mostly on applying LCIA characterization factors and IO LCI results with only limited coverage of the scientific and mathematical underpinnings of the methods. Although students were referred to additional resources for more in-depth learning, this approach may have contributed to an incomplete grasp of these topics for many students and, subsequently, less confidence when applying these methods in their course projects. Perceived difficulties in preparing results for ease of interpretation may be an artifact of the course design, given that students had to construct result tables and graphs manually in their spreadsheet models (as opposed to generating them automatically using a commercial LCA software package). However, this element of the MOOC design was deliberate to encourage careful attention to—and continuous scrutiny of—modeling results and to develop skills in presenting data in ways that maximize utility to an LCA’s audience. 
The MOOC also devoted significant time to the topic of multi-functionality, including two lectures and two homework assignments that followed the ISO multi-functionality hierarchy. While multi-functionality can be challenging for even seasoned LCA analysts, the difficulty ratings in Fig. 3 suggest that even more coverage, hands-on exercises, and practice within the course project related to multi-functionality should be incorporated into future offerings of the MOOC.
4 Shifts in perspectives?
The cohort for Fig. 4 consists of the 195 students who passed the course and completed both the pre-course and post-course surveys. Comparing the distributions of responses before and after the course sheds light on how overall student opinions changed on each broad statement. For example, there was a significant shift toward much stronger disagreement with the statement “LCAs are easy to conduct” after the course. This shift suggests that the course met its objective to demonstrate that sound and credible LCAs require substantial time, resources, data, documentation, and attention to detail. However, the shifts in Fig. 4 also suggest that the MOOC was less effective at convincing students that data are often scarce; that LCAs can be subjective; that results are typically uncertain; and that scientific validity depends heavily on chosen methods, data, and study design. These are broad impressions that seasoned LCA analysts have learned and reinforced through experience, which is a difficult process to replicate in a 9-week introductory MOOC. While the MOOC was designed to convey these limitations implicitly, most lectures and exercises were focused on core LCA methods without explicit and continuous reinforcement of such limitations over the duration of the course. For example, to demonstrate that LCAs can be subjective and based on flawed data and assumptions, students were asked to evaluate critiques of a debunked life cycle energy study of a Prius versus a Hummer (Gleick 2007; Hauenstein and Schewel 2007; Spinella 2007) and assess where the study fell short with respect to specific LCA best practices. However, because this exercise was the final homework assignment, it may have missed an opportunity to reinforce LCA’s limitations more strongly by teaching those limitations in parallel with the methods.
In a traditional classroom, instructors can ensure such limitations are recognized through ad hoc discussions with students, which is not easily achieved in an online discussion forum. The data in Fig. 4 suggest that in a MOOC, such limitations should be made very explicit in the lectures and course materials with steady reinforcement throughout the course.
To address these shortcomings, future improvements to the MOOC may include requiring students to gather some of their own LCI data to demonstrate data scarcity, evaluate differences in subjective methodological decisions in published LCA studies, conduct uncertainty analyses in their course projects, identify examples of false precision in the LCA literature, and critique additional studies to identify deviations from best practices. These changes will necessarily lengthen the duration of the MOOC, but with the benefit of leaving students with more nuanced views of important LCA limitations before they move on to more advanced studies. Despite these shortcomings, the MOOC appeared to be quite successful in fostering constructive skepticism among the passing students. In the post-course survey, when asked if they would approach future environmental data and claims with more or less skepticism and scrutiny compared to when they started the course, 88 % of the cohort in Fig. 4 indicated they would apply more skepticism and scrutiny moving forward.
5 What is next?
The majority of passing students indicated a stronger desire to improve their quantitative skills, employ LCA in their personal and professional decisions, and continue on to more advanced LCA studies after taking the MOOC. Therefore, while there are certainly limitations to the MOOC in its original form as discussed in previous sections, it seemed to succeed in motivating students to further develop their LCA skills.
These results suggest there might be a valuable role for MOOCs in recruiting and motivating students to join the field, after which they can move on to more advanced LCA studies. Furthermore, passing students indicated a strong desire for additional MOOCs on advanced LCA topics, which presents an opportunity for the LCA community to develop new online courses. When asked if they would enroll in a follow-on MOOC covering more advanced LCA topics, 87 % responded “yes.” Interestingly, while some students expressed increased desire to pursue a career as an LCA analyst, an equal number of students expressed less desire to do so after completing the MOOC. While the reasons for this outcome are not clear, the other responses indicate strong desires to utilize LCA methods and results moving forward, even if some of those students are no longer interested in conducting LCAs as a career choice. Therefore, it appears that the MOOC was successful in fostering enthusiasm and capabilities among the passing students for greater use of LCA in practice, in whatever ways make sense for their careers and core disciplines.
6 Conclusions
The MOOC experience described here suggests that online courses can play a key role in attracting and motivating a large number of students and in equipping them with basic analytical skills to move on to more advanced LCA studies. However, MOOCs are not without structural limitations, especially related to mostly “locked in” content and the impracticality of directly supporting individual students, which in this case created challenges for teaching difficult topics and conveying important limitations of LCA in practice. While future improvements will be made to address these shortcomings, in its first run, the MOOC seems to have met its primary aims of reaching and motivating many prospective LCA practitioners. Moreover, despite its low retention rates, the MOOC succeeded in effectively teaching LCA basics to over 700 passing students, at their own convenience, which is equivalent to conducting 20 traditional in-person courses (with 35 students each). Therefore, MOOCs may also offer the LCA community a highly efficient means of attracting and training its next generations of LCA analysts.
Surveys like the ones summarized here are critical for shedding light on the strengths and weaknesses of different pedagogical approaches, including this MOOC. Such surveys should be conducted by more LCA instructors and researchers to foster greater knowledge sharing moving forward, especially given the growing demand for LCA practitioners in many economic sectors. The survey data collected during this MOOC suggest that there is a significant opportunity for more online courses on advanced LCA topics. How online courses contribute to the advancement of LCA pedagogy and educational opportunities remains to be seen, since MOOCs are a recent phenomenon. However, the broad reach and reasonable success of this MOOC suggest that there is both a valuable role and a strong demand for such courses moving forward.
Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.