1 Introduction

Serious gaming is gaining momentum in educational contexts as an interactive and motivational approach to learning (Jabbar and Felicia 2015; Ke et al. 2016; Boyle et al. 2016). Because of these advantages over traditional educational delivery methods, serious games are increasingly used for training in governmental, military, health, and educational arenas (Zyda 2005; Boyle et al. 2016).

Serious games have been used for educating and training healthcare students and professionals, and there is evidence that they have positive effects on knowledge and skill improvement (Blakely et al. 2009; Akl et al. 2010; Graafland et al. 2012; Akl et al. 2013; Wang et al. 2016). In addition, serious games provide opportunities for healthcare students to practice in safe environments (Blakely et al. 2009). Furthermore, Kron et al. (2010) and Lynch-Sauer et al. (2011) found that medical and nursing students, respectively, had positive attitudes towards computer games. Although serious games have not been widely used in dental education, they have been applied in several subjects. For example, there are serious games for pre-clinical subjects such as dentine bonding (Amer et al. 2011) and alginate mixing (Hannig et al. 2013), and both were found to have a positive impact on learners.

Dental public health is a field of dentistry, defined as “the science and art of preventing oral disease, promoting oral health and the quality of life through the organized efforts and informed choices of the society, organisations, public and private, communities and individuals” (Gallagher 2005). Therefore, it was considered important and helpful to apply serious gaming to dental public health education, where students can gain relevant experiences in online learning situations. The aims of this study were to investigate the use of a serious game in dental public health education by analysing a key feature of the game, its log of the gaming activities of play-learners (dental students), and to explore student perspectives on the game.

The paper will firstly describe the public health game GRAPHIC. The methods of this study will then be presented, followed by findings, discussion, and finally a summary of this study.

2 GRAPHIC-II

An intervention using the first iteration of the GRAPHIC (Games Research Applied to Public Health with Innovative Collaboration) Game, or GRAPHIC-I, was reported by O’Neill et al. (2012) as having potential for dental public health education following a pilot with undergraduate dental students. The learning outcome of GRAPHIC was to enhance dental students’ understanding of how to design health promotion programmes for a given population, in this case primary school children (O’Neill et al. 2012). In other words, by completing the game, students should be able to:

  1. identify the oral health needs and demands of a given population;

  2. recommend a series of possible actions in support of improving oral health;

  3. agree on a plan of action; and

  4. outline a monitoring and evaluation strategy.

To complete the game, students were firstly required to consider information on a virtual town, which constituted a learning scenario for the game. A variety of health promotion interventions were provided as possible options (Fig. 1), including ‘Restricting sugars intake’, ‘School lunchbox policy’, and ‘Healthy schools programme’. From these choices, students needed to use an evidence-based approach, by (a) reading recommended literature relating to each intervention, (b) considering if the evidence was strong enough, and then (c) evaluating whether an intervention was relevant to the learning scenario or not. After that, students were required to submit the five best options for health promotion interventions provided in the game.

Fig. 1

A screenshot from GRAPHIC-II showing some examples of health promotion interventions with a function to add a selected intervention

Evaluation data and feedback from staff and students on GRAPHIC-I revealed considerable room for improvement in the gaming interface and its functionality, in particular easier navigation and selection of health promotion initiatives (Sipiyaruk 2013). GRAPHIC-II was devised as an improved online serious game for the next cohort of dental students. It was developed as a stand-alone web-based game and, in this iteration, students had to achieve a score of 100% through a reasoned, evidence-based approach. Although the rules to complete the game were similar to the previous version, the design was improved:

  1. The user interface was improved visually, with a colourful graphical display, and functionally, to increase the speed and ease of selecting possible interventions, thus supporting learner progress through the game. This involved adding ‘GAME PROGRESS’ and ‘ACTION PLAN’ boxes (Fig. 2).

  2. Instead of a virtual community, this version used information from an African country as the learning scenario.

  3. An analytics function was incorporated into the game engine to log all gaming activity, including submitted answers. This function could therefore be used to reveal the strategies employed by students to complete the game.

Fig. 2

A screenshot from the game showing how students can view their progress, with a box presenting the interventions selected for their ‘action plan’

3 Methods

GRAPHIC-II was made available as an in-course requirement of the dental public health course for one academic year (2013–14) of undergraduate dental students at King’s College London Dental Institute. The learning outcomes were the same as in GRAPHIC-I; students were required to provide a set of the best five options to promote dental health in a given population scenario. The game system was first tested technically by the development team and by academic dental public health staff at KCL. Afterwards, the scientific content of the game and the scoring system were reviewed and validated by the academic team.

To complete the game, students had to obtain a score of 100% and were permitted unlimited attempts to submit an answer. Rather than learners interacting with the game as a group, it was played by individuals as part of course requirements, thus providing the opportunity to measure individual activity through analytics. Students first accessed the game on-campus, where they were given staff support, and subsequently used and completed the game off-campus. Students were divided into two groups and played the game in successive weeks. They were required first to play the game in class to build familiarity, and were then permitted to complete the game in their own time. Afterwards, students were required to complete a reflective assignment relating to their chosen health promotion initiatives.

The analytical data were automatically collected by the logging system of the game; this included when and for how long each student logged into the game, how many times each student submitted their answers, and what answers they submitted. The anonymised data were exported in an Excel file for analysis. Descriptive analysis was performed to summarise the basic features of the data, and an independent t test was used to compare the number of submission attempts between the two groups. Additionally, students were encouraged to provide feedback, which was analysed thematically using a framework analysis (Spencer et al. 2013).
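As an illustration of this group comparison, the independent t test on submission attempts can be sketched with the standard library alone. The sketch below uses Welch’s variant (the paper does not specify which form was used) and hypothetical attempt counts, not the study data; the study itself used SPSS.

```python
import math
from statistics import mean, stdev

def independent_t(a, b):
    """Welch's independent-samples t statistic comparing two
    groups of submission attempts (unequal variances assumed)."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    se = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical attempt counts for two cohorts (not the study data)
group1 = [18, 25, 12, 30, 7, 21]
group2 = [5, 9, 3, 14, 6, 8]

print(f"group 1 mean = {mean(group1):.1f}, group 2 mean = {mean(group2):.1f}")
print(f"t = {independent_t(group1, group2):.2f}")
```

In practice the p value would be obtained from the t distribution (for example via `scipy.stats.ttest_ind`); the statistic alone is shown here to keep the sketch dependency-free.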

Ethical approval for this project was granted by the Biomedical Sciences, Dentistry, Medicine and Natural & Mathematical Sciences Research Ethics Subcommittee (BDM RESC), King’s College London (KCL) College Research Ethics Committees (CREC), application number BDM/13/14-117, on 15th July 2014. The set of anonymised data was analysed using SPSS Version 22.

4 Results

A total of 163 dental students from King’s College London were assigned to complete GRAPHIC-II. Five students were not able to complete the game on time and were excluded from the analysis; thus 79 students played GRAPHIC-II each week. The data from these 158 students are reported in this study, including the number of submission attempts, the grouping of answers (list of health promotion interventions), and the time taken by students to achieve the required score of 100%.

Amongst the 158 students who completed the game, the average number of attempts to achieve a score of 100% was 13.6 (range 1–126, SD = 16.31). The average for group 1 students was 18.3 (range 1–57, SD = 14.45), whilst that for group 2 was 8.9 (range 1–126, SD = 16.80) (p < 0.001), as presented in Table 1. There were 23 students from group 2 (and only 4 students from group 1) who achieved 100% with the first attempt.

Table 1 The average number of attempts students used to submit their answers

Regarding the best five options, students had to choose from the provided list of health promotion programmes; 11 answer patterns were used overall by this year group. Amongst the class, 27 students completed the game using only one submission attempt; these one-attempt students submitted four different patterns of answers. The time from logging onto the game to submission varied widely, from 3 min to 2 days (Median = 7 min, Mode = 7 min, Mean = 132.3 min, SD = 647.09), showing that students took widely varying times to submit an accurate response.

Exploring the behaviour of students who submitted their answers with a high number of attempts, 18 students took more than 30 attempts to complete the game, over varying periods of time. The three highest numbers of submission attempts were 126, 74, and 57, as presented in Table 2. In terms of submission rate (attempts/minute), the three highest rates amongst these students were 2.74, 2.47, and 2.11 attempts/minute (Table 3).

Table 2 The five highest numbers of submission attempts and their rates
Table 3 The five highest submission rates

Student feedback was largely positive, focusing on the navigation of the game, which was considered user friendly and visually intuitive. Examples of this type of feedback were:

… it’s nice and colourful and eye-catching.

Easy to use, interactive and a truly worth-while task …

Further suggested developments included giving learners feedback on why a selected set of answers was incorrect, and providing a learning scenario more relevant to their current UK context. As learners commented:

… perhaps giving feedback on the reasons as to why the incorrect options are regarded as being incorrect.

Using a more relevant example …

5 Discussion

5.1 Log Data as Indirect Observation

The data in this study were collected automatically by the logging engine of the game. This data collection technique can be considered a strength of serious games. Smith et al. (2015) claim that this technique represents an indirect observation, where the process of data collection does not distract players or learners, in contrast to direct observation. In addition, this technique allows a researcher to collect data remotely, as seen in GRAPHIC-II. The analysis of log data can be considered a form of stealth assessment, allowing the researcher to assess how users interact with the game tasks (Snow et al. 2015). In GRAPHIC-II, the log data included student identifiers, when and for how long they logged into the game, how many times they submitted their answers, and what answers they submitted.
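To make the structure of such log data concrete, the sketch below models one logged submission event and derives per-student attempt counts from a list of events. The field names and example values are illustrative assumptions, not the actual GRAPHIC-II schema.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LogEntry:
    """One logged submission event (hypothetical schema)."""
    student_id: str      # anonymised identifier
    timestamp: datetime  # when the answer was submitted
    answer: tuple        # the five chosen interventions

def attempts_per_student(entries):
    """Count how many submission attempts each student made."""
    counts = defaultdict(int)
    for entry in entries:
        counts[entry.student_id] += 1
    return dict(counts)

entries = [
    LogEntry("s001", datetime(2014, 3, 3, 10, 0), ("A", "B", "C", "D", "E")),
    LogEntry("s001", datetime(2014, 3, 3, 10, 7), ("A", "B", "C", "D", "F")),
    LogEntry("s002", datetime(2014, 3, 10, 9, 30), ("A", "C", "D", "E", "F")),
]
print(attempts_per_student(entries))  # {'s001': 2, 's002': 1}
```

The same event list supports the other measures reported in the study, such as time from first login to successful submission, without any further instrumentation of the learner.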

5.2 Game Completion with Low Submissions

From the log data of GRAPHIC-II, it can be concluded that group 2 students used fewer submission attempts than group 1 (p < 0.001). This suggests that the students in group 2 performed better in completing the game and achieving the score of 100% than their group 1 counterparts. Given the game design, however, completing the game with a one-attempt submission was not expected. These students are intellectually able, and so some were able to achieve the correct result quickly at their first attempt, whilst others may have taken more care over their response. It is also possible that the group 1 students discussed their learning with peers, which could explain why more students in group 2 submitted five correct answers at the first attempt. The issue of collusion has been noted as relevant for online assessment, especially when students do not all have to complete tasks at the same time (Rowe 2004), which is common with formative assessments. However, the fact that students used 11 different answer patterns to achieve the final score of 100% meant there was not just one right answer; this reflects reality, where different groupings of evidence-based health promotion initiatives may be used to promote oral health, and it reduces the incentive for collusion. Furthermore, students were permitted to complete the game at their leisure, and some chose to do so; however, an important issue to explore is that students may not gain the requisite learning through the game if they complete it at the first attempt and do so quickly.

It can be suggested that the level of difficulty of the game was suitable for most students; however, incorporating additional learning scenarios with a higher level of difficulty for students who complete the assignment quickly may be a helpful way of improving learning. It would also be advantageous if game activities were scheduled so that all students played and completed the game at the same time; however, timetabling challenges mean that all students are not always available on the same day, and it is difficult to provide sufficient computers if the whole class plays the game simultaneously. In future, as the personal computing capabilities of students increase, it should become easier to run large classes.

Encouraging autonomous and self-directed learning is considered beneficial (Grow 1991). Feedback or clues can also be considered a solution, so that students improve their knowledge using feedback within the game rather than relying on other students. Greater autonomy also seems appropriate for dental students, who will eventually be working in their own practice once they have finished their degree in dentistry. Therefore, it would be ideal if students could recognise the benefits of the game and complete it autonomously. Furthermore, pre- and post-knowledge tests can be considered supporting tools. A pre-knowledge test can identify students’ knowledge gaps, and students then need to pay attention to completing the game in order to achieve a higher score in the post-knowledge test.

5.3 Game Completion with High Submissions

Another student strategy to be examined is the random submission of answers until successful completion of the game, possibly without reading the provided learning materials or following the intended game process. Ordinarily, students would need to read the evidence-based learning materials provided in GRAPHIC to select the best five answers for submission, which requires time to consider which choices should be selected. However, according to our data, some students appeared to submit their answers randomly, as they spent less than a minute considering and/or submitting answers. Although these students completed the game, with a great number of submission attempts, they may not have achieved the game outcomes because they skipped the process of thinking. To prevent this issue, the game should include a new function limiting the number of submission attempts to an appropriate level, thus encouraging students to take more time and care over each answer. This feature would also enhance the challenge of the game, as not all students would be able to achieve a score of 100%.
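The proposed attempt-limiting function could be sketched as below. The cap and the minimum interval between attempts are hypothetical values chosen for illustration; the paper does not propose specific numbers.

```python
from datetime import datetime, timedelta

MAX_ATTEMPTS = 20                    # hypothetical cap on submissions
MIN_INTERVAL = timedelta(minutes=1)  # hypothetical minimum gap between attempts

def can_submit(attempts_so_far, last_attempt, now,
               limit=MAX_ATTEMPTS, min_interval=MIN_INTERVAL):
    """Allow a submission only if the attempt cap has not been reached
    and enough time has passed since the previous attempt, discouraging
    rapid random guessing."""
    if attempts_so_far >= limit:
        return False
    if last_attempt is not None and now - last_attempt < min_interval:
        return False
    return True

t0 = datetime(2014, 3, 3, 10, 0)
print(can_submit(0, None, t0))                        # True: first attempt allowed
print(can_submit(5, t0, t0 + timedelta(seconds=20)))  # False: resubmitted too soon
print(can_submit(20, t0, t0 + timedelta(minutes=5)))  # False: cap reached
```

The interval check addresses the sub-minute submissions observed in the log data, while the cap addresses the very high attempt counts; either rule could be used independently.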

5.4 The Role of Failure

To understand why students who adopted these two unexpected strategies might not achieve the learning outcomes, compared with students who learnt from the game, it is necessary to explain the role of ‘failure’. An underpinning principle of game design suggests that ‘failure’ in games is not generally regarded as failure, because players can reflect on it to re-evaluate or adjust their strategies towards the goals of the game (Gee 2008; Klopfer et al. 2009). This can be considered a learning process when using a serious game. In other words, within GRAPHIC, students may not initially achieve the score of 100%; from the feedback, they need to re-evaluate their submitted set of health promotion interventions and reconsider other options when submitting an answer, improving their score until they complete the game.

5.5 Benefits and Limitations of the Log Data Analytics

The activity logging function of GRAPHIC-II can be considered a valuable feature of serious games. Learning analytics can be applied to identify students who are stuck in their learning (Serrano-Laguna et al. 2014). In this study, this feature identified strategies or gaming behaviours (e.g. random answers) that students used to complete the game, which might hinder some from achieving the desired learning outcomes. It can therefore help identify students who would benefit from support. However, a limitation of relying solely on the analytics of log data within GRAPHIC-II must be recognised. Some students may have fully understood the underpinning theoretical framework they had been taught before encountering GRAPHIC; they could therefore more easily obtain correct answers at the first attempt and still achieve successful learning outcomes. Consequently, further evaluation of students’ understanding (e.g. an interview, a reflective assignment, or pre- and post-knowledge tests) is required to confirm whether students have achieved the learning outcomes. For example, as mentioned in the methods section, this cohort of students was required to complete a reflective assignment after game completion; this would examine whether or not they understood the reasons for the answers they selected.

5.6 Further Development of GRAPHIC

According to the user feedback, the findings suggest that GRAPHIC-II was much improved from GRAPHIC-I, especially in relation to game navigation. This can be claimed as a major improvement, as problematic navigation can distract students from their learning. In addition, the new interface in GRAPHIC seemed to be more engaging for students. However, further developments were suggested, such as providing feedback to users, making the learning scenario more relevant to students’ context, and offering additional learning scenarios with higher levels of difficulty. These would enhance the quality of learning within GRAPHIC.

6 Conclusion

GRAPHIC-II offers a gaming environment where students can practice critical thinking and decision-making skills regarding health promotion in a safe setting. In GRAPHIC-II, users were required to consider the provided learning scenario, evaluate the given list of health promotion programmes, and subsequently decide on the appropriate answers. Therefore, students were able to gain experience in a safe environment. In addition, the activity logging system of serious games can be a valuable feature in supporting academic staff when evaluating whether students acquire critical thinking and decision-making skills in dental public health. This feature can be considered a stealth assessment, which allows academic staff to observe how students complete the game without interrupting their activities. Using this technique, academic staff can identify students who may be stuck in the learning process and help them achieve the desired learning outcomes. However, multiple techniques, including pre- and post-knowledge tests or a reflective assignment, are required to confirm whether the learning outcomes are achieved through the game. Overall, GRAPHIC-II made a greater contribution to dental public health education than GRAPHIC-I; however, further improvements are required to enhance its quality.