Background

Collecting information on food and dietary intake provides valuable insights into the associations between diet and health, and helps to evaluate the impact of intervention programmes. High quality dietary assessment instruments with high validity and reliability are therefore needed for all ages [1, 2]. Measuring adolescents’ dietary intake remains challenging and many sources of measurement error have been reported [3]. Adolescents may be less interested, less motivated and less cooperative than other age groups with regard to reporting their diet [3, 4]. Preliminary studies among adolescents suggest that the innovative use of new technology may improve the accuracy of adolescents’ dietary information [5, 6].

Online dietary recall systems based on national food databases and languages have been developed and tested in a number of countries, such as the Automated Self-Administered 24 h recall (ASA24) [7] and DietDay [8] in the US, and WebDASC in Denmark [9]. Furthermore, Young Adolescents’ Nutrition Assessment on Computer (YANA-C) was originally developed to collect dietary data among Belgian–Flemish adolescents, and an adapted version, Children and Adolescents’ Nutrition Assessment and Advice on the Web (CANAA-W), was developed for online use [10]. Some tools are also available in the UK; for example, the Synchronized Nutrition and Activity Program (SNAPTM) for 7 to 15 year old children [11] and INTAKE24 for 11 to 24 year olds [12]. However, there is no British online 24 h dietary recall tool suitable for the whole population.

myfood24 (Measure Your Food on One Day) is a new and innovative self-administered online 24 h dietary recall/record tool designed to make the collection of multiple automated self-administered 24 h recalls/records feasible across a wide variety of groups and settings in large-scale epidemiological studies. The development of myfood24 was based on focus group evaluation of existing tools for different age groups, including adolescents [13], adults and older adults, thereby covering a wide age range of the British population [2]. Lessons were also learnt from the design of ‘My Meal Mate’ (MMM), a Smartphone app for weight loss [14] designed by the same research team.

Available dietary assessment websites vary in their features (website functions and usability), food database and dietary assessment method used (recall or record). myfood24 has been designed to balance researchers’ need to collect detailed dietary information with users’ desire for a quick and easy tool. To reduce the completion time, myfood24 implements some aspects of the Automated Multiple Pass Method (AMPM) [15], with an optional quick list as the first pass; a detailed food search; forgotten items; prompts for a limited number of foods; and a final review before submission. The food database in myfood24 is unique and is based on several food composition data sources (~3,500 items from British food composition tables, ~33,000 manufacturers’ items, 700 fast foods and ~4,600 supermarket items), providing approximately 45,000 food items in the tool, with the potential to be updated. Moreover, one of the advantages of using myfood24 is that researchers can select either the recall or the diary option [16, 17].

The use of new technology in health education and dietary assessment is in its infancy; therefore, information regarding the process of developing and testing such tools is limited [18]. Successful web-based dietary assessment software needs to be intuitive, simple and engaging for users [2]. To identify these components, usability-testing is rated as the most effective method for creating greater strategic impact and enhancing the final product [19, 20].

The terms usability and acceptability are often used interchangeably and in many different ways, as there are no absolute definitions for them. Usability is the overall technical term for user experience, user friendliness and ease of use [21]. It describes how well users can operate the system’s functions: whether the system is easy to use and learn, efficient, easy to remember, produces few errors and is pleasant to use [22]. Acceptability, in contrast, can be defined as the willingness of users within a specific target group to employ the tool for the tasks it is designed to support [23]. Usability and acceptability are the two leading criteria for successful design [21]. Usability-testing should be carried out in different stages: on the beta-version of the tool (the final version of the development process, before final amendment and public release) and after making changes, to evaluate the usefulness of the modifications and the overall acceptability of the tool [24–27].

Although usability-testing has been used in the development of various e-health software, such as smoking websites [28] and online physical activity tools [29], there is limited published information on the usability-testing of online dietary assessment websites [30]. Therefore, this study aims to identify usability and design issues associated with the completion of myfood24 (beta-version) and to determine tool acceptability among adolescents (11–18 y) before and after making improvements (beta- and live-versions).

Methods

myfood24 features

myfood24 contains two main areas. In the researcher area, researchers can customise the website to fit their study design: adding project-related text and a logo, personalised additional help and a tailored invitation; sending reminder emails to participants; selecting the recall or diary option; and choosing whether or not to display a nutrient summary to participants. In the participants’ area, participants can select food items from the search bar or create a recipe in the ‘recipe builder’. Users can filter food items by selecting any of the filter options: ‘recently used items’, ‘recipes’ that have been entered previously and ‘food group’. For instance, if a search for ‘apple’ is made, only categories containing that item are displayed, such as apple (fruit), apple pie (dessert) and apple juice (drink), or the results can be filtered by brand.
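To make the filtering behaviour above concrete, the following minimal Python sketch shows how a search for ‘apple’ could be narrowed by food group or brand. The food items, field names and the `search` function are hypothetical illustrations rather than myfood24’s actual implementation.

```python
# Minimal illustration of category/brand filtering on a food search.
# Food items, field names and the search function are hypothetical.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FoodItem:
    name: str
    food_group: str            # e.g. "fruit", "dessert", "drink"
    brand: Optional[str] = None

FOOD_DATABASE = [
    FoodItem("Apple, raw", "fruit"),
    FoodItem("Apple pie", "dessert"),
    FoodItem("Apple juice", "drink", brand="Example Brand"),
    FoodItem("Banana, raw", "fruit"),
]

def search(term: str, food_group: Optional[str] = None,
           brand: Optional[str] = None) -> List[FoodItem]:
    """Return items whose name contains the search term,
    optionally narrowed by food group or brand."""
    term = term.lower()
    results = [f for f in FOOD_DATABASE if term in f.name.lower()]
    if food_group is not None:
        results = [f for f in results if f.food_group == food_group]
    if brand is not None:
        results = [f for f in results if f.brand == brand]
    return results

# "apple" matches items across categories; adding food_group="fruit" narrows it.
print([f.name for f in search("apple")])                      # all three apple items
print([f.name for f in search("apple", food_group="fruit")])  # ['Apple, raw']
```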

After a food item is chosen, food portion size (FPS) options are presented as a dynamic extension of the search result screen to enable a seamless (all-in-one) user experience. Various options are presented to cover different food types and to maximise participants’ ability to determine their portion size. myfood24 has images for 100 different food types (the most frequently consumed foods), each with seven portion sizes to select from. To maximise image coverage, images are assigned to all food items that look similar. Food photos were obtained from the Young Person’s Food Atlas Secondary [31]. Alternatively, participants can enter a specific weight in grams/millilitres if they know the exact portion size, or use a standard pack size.
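As a rough illustration of how portion-size images might be shared across similar-looking foods, the sketch below maps foods to a common image set with seven portion weights; all identifiers and weights are invented placeholders, not myfood24’s actual data.

```python
# Hypothetical mapping from foods to shared portion-size image sets.
# Weights (in grams) and names are placeholders for illustration only.
from typing import Dict, List

# Each photographed food type has seven portion-size options.
PORTION_IMAGE_SETS: Dict[str, List[float]] = {
    "pasta_cooked": [60, 90, 130, 180, 230, 290, 350],
    "breakfast_cereal": [15, 25, 35, 45, 60, 75, 90],
}

# Foods without their own photos reuse the image set of a similar-looking food.
IMAGE_SET_FOR_FOOD: Dict[str, str] = {
    "Spaghetti, boiled": "pasta_cooked",
    "Macaroni, boiled": "pasta_cooked",
    "Cornflakes": "breakfast_cereal",
}

def portion_options(food_name: str) -> List[float]:
    """Return the seven portion weights shown for a food, or an empty list if
    the user must type a weight (g/ml) or pick a standard pack size instead."""
    image_set = IMAGE_SET_FOR_FOOD.get(food_name)
    return PORTION_IMAGE_SETS.get(image_set, []) if image_set else []

print(portion_options("Macaroni, boiled"))  # shares the cooked-pasta image set
```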

The selected food item and its portion size are added to the meal tracker display area (breakfast, lunch, evening meal, snacks and drinks). To enhance the completeness of reporting, a pop-up message appears on the screen after the selection of some common foods (e.g. bread, cereal) to probe for food items (e.g. butter or margarine, milk) that are often eaten in combination. Moreover, if participants forget to add the portion size after entering a food item, the software prompts them to check the entry. Before submission of the dietary data, the review screen prompts participants to check their entries and answer additional questions regarding supplement intake and whether or not their food record is typical of a regular day. After submission, the diary can no longer be edited by users. A ‘thank you’ message with an optional summary of energy and nutrient intakes (energy, macronutrients, saturated fat, fibre, sugar and salt) is displayed. For enquiries relating to the use of myfood24 in research, please visit www.myfood24.org.
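The prompting behaviour described above can be pictured with a small sketch in which hypothetical pairing rules trigger pop-up questions and a missing portion size triggers a check prompt; the pairings and the `prompts_for_entry` function are illustrative only, not myfood24’s code.

```python
# Hypothetical rules driving the completeness prompts described in the text.
from typing import List, Optional

COMMON_PAIRINGS = {
    "bread": ["butter or margarine"],
    "cereal": ["milk"],
}

def prompts_for_entry(food_name: str, portion_size_g: Optional[float]) -> List[str]:
    """Return the prompts to show after a food is added to the meal tracker."""
    prompts = []
    for trigger, companions in COMMON_PAIRINGS.items():
        if trigger in food_name.lower():
            for companion in companions:
                prompts.append(f"Did you also have {companion} with your {trigger}?")
    if portion_size_g is None:
        prompts.append("Please check this entry: no portion size has been added.")
    return prompts

print(prompts_for_entry("Wholemeal bread", portion_size_g=None))
```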

Usability-testing techniques

Figure 1 illustrates the study design, which consisted of two stages: stage-I was conducted on the beta-version of myfood24 and stage-II was conducted on the live-version of myfood24 after the amendments had been made.

Fig. 1 myfood24 usability and acceptability study design

Participants

It has been reported that 80 % of usability problems are uncovered with the inclusion of five participants and 90 % with ten participants, with each additional participant contributing fewer new problems [19, 32]. Usability-testing of the beta-version therefore requires approximately 10 to 20 participants to enable the vast majority of usability issues to be identified [33]. In stage-II, more participants were recruited to test the acceptability and feasibility of using myfood24 (live-version) in a larger sample, ensuring a representative sample of adolescents from each age group and gender. Adolescents aged 11–18 years old were recruited from four different secondary/high schools in different areas of Leeds. Participants who did not speak English as their first language were excluded, as myfood24 uses only the English language. Experience in using a computer was not required. Written consent was obtained from all participants, and parental consent was obtained for adolescents younger than 16 years old. The study was reviewed and received approval from the University of Leeds Research Ethics Committee (MEEC 11–046).

Procedure

Stage-I: myfood24 (beta-version)

The researcher informed participants at all stages that the study was not intended to test their ability to use a computer but rather the quality and attributes of myfood24. There were two different groups. In group-1, the researcher identified a list of key tasks that all users should be able to perform on myfood24. These pre-defined tasks were designed as a scenario to test specific features of myfood24; for example, using the ‘make a list’ function, entering food recipes and selecting different portion size options, as well as correcting mistakes (Additional file 1). The standardised tasks contained a variety of foods drawn from those most commonly consumed by adolescents [34]. The test was carried out in a meeting room at the University of Leeds and lasted around 60 min for each participant.

Screen capture software (Camtasia Studio version 8, TechSmith, USA) was used to record participants’ screens and verbal comments whilst they undertook the user tests of myfood24. During the session, users were instructed to speak out loud about positive and negative experiences as they performed each task, and they were encouraged to complete the tasks by themselves. The researcher observed the users indirectly (because direct observation may have made some participants feel uncomfortable) and reported the users’ behaviour against the task analysis criteria (Additional file 1). At the end, participants filled in a usability-acceptability questionnaire and then received £5.00 remuneration.

In group-2, remote usability-testing [35] (participants completing the test at home) was carried out to obtain a clear indication of how myfood24 would perform in a real-life situation and to test the availability of different food items in the software. Participants were unmoderated and were asked to complete one 24 h dietary recall using myfood24 and then complete the same usability-acceptability questionnaire used with group-1. Users were also asked to provide written comments on any problems they faced.

Stage-II: myfood24 (live-version)

Based on users’ feedback from the usability-testing in stage-I, amendments were made to develop the final version of myfood24 (live-version). Seventy adolescents were recruited and, for logistical reasons related to managing the research in schools, were divided into 14 groups of five participants. In each group, three participants were assigned to start with myfood24 and two to start with the interviewer (MPR), to reduce bias due to a learning effect in the second set of responses. Participants were asked to complete a 24 h dietary recall in myfood24 (without any assistance from the researcher) and then attend an interviewer-administered 24 h recall on the same day, on two non-consecutive days at school. After the second use of myfood24, participants were asked to complete the usability-acceptability questionnaire.

Usability-acceptability questionnaire

The questionnaire consisted of three parts. The first part covered demographic information and questions regarding participants’ self-defined attitudes towards new technology. The second part contained three open-ended questions inviting comments on myfood24 (my favourite thing about myfood24 was…, my least favourite thing… and which particular areas users think we need to address in detail within the ‘Help’ section…). Finally, participants answered questions regarding myfood24’s acceptability and satisfaction. This part contained eight statements rated on a five-point Likert scale (1 strongly-disagree to 5 strongly-agree), in addition to the System Usability Scale (SUS) (Additional file 1) [36, 37]. SUS is a validated, reliable and free-to-use tool [36, 37]. Two questions were added to the questionnaire in stage-II: to identify adolescents’ opinions of myfood24 compared with the interviewer-administered 24 h recall, they were asked to rank the ease of undertaking each method on a five-point Likert scale (1 very-easy to 5 very-difficult), choose their preferred method and give the reason why.

Data analysis

In stage-I, qualitative data from the verbatim transcripts of each participant’s screen recording and the researcher’s observational notes (group-1), as well as users’ comments regarding issues experienced while using myfood24 (group-2), were analysed in accordance with the principles of thematic analysis [38]. The different codes were sorted into potential themes and all the coded data extracts were gathered within these themes. The themes covered the key areas of myfood24 and are presented in Table 2. All findings from group-1 and group-2 were combined to report the full range of usability issues and users’ recommendations for improving the tool. Findings from the open-ended questions in the questionnaire were analysed using thematic analysis [38] and some typical quotes were selected to represent different views of users’ acceptability.

To reduce the risk of bias, all qualitative data were coded by two researchers; conflicts in coding decisions were reviewed by all researchers and resolved by consensus. The overall completion time in myfood24 was calculated in both stages, as was the overall SUS for each respondent. The SUS consists of 10 statements, each rated on a 5-point Likert scale. For statements 1, 3, 5, 7 and 9 the contribution of each is the scale position minus 1; for statements 2, 4, 6, 8 and 10 the contribution of each is 5 minus the scale position. The sum of the contributions is then multiplied by 2.5 to obtain the overall score, which ranges from 0 (negative views) to 100 (positive views) [37]. Moreover, findings from the eight Likert-scale statements regarding myfood24 acceptability and user satisfaction were calculated before and after making the improvements, and the overall median and interquartile range (IQR) were calculated. A Mann–Whitney U-test (two-sample) was used to test the rank differences in users’ perceptions between myfood24’s beta-version and live-version. An unpaired t-test was used to test the improvement in SUS between stage-I and stage-II. Analyses were performed using STATA statistical software release 11 (Stata Corporation) and the significance level was set at 0.05.
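As a worked illustration of the scoring arithmetic and group comparisons described above, the short Python sketch below computes an SUS score and applies an unpaired t-test and a Mann–Whitney U-test; the `scipy.stats` functions are standard, but all responses and scores are invented for illustration.

```python
# Worked example of SUS scoring and the two group comparisons; data are invented.
from scipy import stats

def sus_score(responses):
    """responses: the 10 Likert ratings (1-5) in questionnaire order.
    Odd-numbered statements contribute (rating - 1); even-numbered statements
    contribute (5 - rating); the sum is multiplied by 2.5 to give 0-100."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: odd statements rated 4, even (negatively worded) statements rated 2.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0

# Hypothetical per-participant SUS scores for each stage.
beta_scores = [60, 65, 70, 72, 55, 68]
live_scores = [70, 74, 78, 72, 76, 80]
print(stats.ttest_ind(beta_scores, live_scores))  # unpaired t-test, stage-I vs stage-II

# Mann-Whitney U-test on ranked Likert responses to one acceptability statement.
beta_likert = [3, 2, 4, 3, 3, 2]
live_likert = [4, 4, 5, 3, 4, 5]
print(stats.mannwhitneyu(beta_likert, live_likert, alternative="two-sided"))
```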

Results

Characteristics of participants

In total, eighty-four participants took part in this study. Table 1 shows participants’ general characteristics at each stage. Most participants were of white ethnicity and their ages ranged from 11 to 18 years. All users in stage-I and 69 (99 %) in stage-II had internet access at home, and 12 (86 %) and 59 (84 %), respectively, accessed the internet daily. On a scale of 1 to 10 for confidence in using technology, 7 (50 %) of the stage-I participants rated themselves 9/10, 5 (36 %) rated themselves 10/10, and the rest rated themselves 8/10. Similarly, stage-II participants ranked themselves in the three highest categories of confidence in using technology: 26 (37 %) rated themselves 10/10 and 20 (29 %) rated themselves 9/10.

Table 1 Sample characteristics by study stage, completion time and system usability scale (SUS) of myfood24

Completion time

The mean completion time for myfood24 (beta-version) in stage-I was 31.8 (SD = 3) minutes when using the standardised tasks (group-1) and 31.0 (SD = 9) minutes for participants in group-2. After the amendments to the beta-version of myfood24, the completion time was reduced to 16.2 (SD = 5) minutes in the live-version.

Usability of myfood24 (beta-version)

The thematic analysis from stage-I revealed a number of key issues that needed to be addressed to enhance the overall utility of myfood24 among adolescents. The usability issues and participants’ comments/recommendations to enhance the tool are illustrated in Table 2. All reported issues were fed back to the software developers to inform the final development of the live-version. Snapshots of myfood24 have been annotated to give examples of the usability issues encountered during the test (Fig. 2).

Table 2 Usability problems of the beta-version of myfood24 and adolescents’ comments and recommendations
Fig. 2 Snapshot of food search bar and food portion size in myfood24 (beta-version). (Red text indicates areas identified for improvement)

From the open-ended questions, four themes emerged regarding the favourite aspects of myfood24: design and layout, nutritional feedback, ease of use, and the availability of many options in the tool. Participant-6 said “finding out how my diet compared to the guide is very useful”. Participant-5 said “there are many options to enter the food diary and they were well integrated”. In contrast, the many pop-up questions, technical issues and difficulties in selecting FPS were the least favourite aspects of myfood24. Participant-4 stated that “It took you a while to find the products”. Participant-2 indicated that “Choosing the FPS was not straight forward, as there are many options making it difficult to choose”. Participant-1 said “pop-up questions were unnecessary in certain foods and were slightly annoying”. Three themes were revealed regarding the areas adolescents thought should be addressed in detail in the ‘Help’ section. Two adolescents asked for more instructions on how to use the ‘recipe builder’ and two suggested adding a short video on how to enter cooked foods and select FPS. Most of them (10; 71 %) stated that the tool was easy to use and that there was no specific area that needed to be addressed in the help functions. Participant-12 said “The website is easy to use and therefore doesn’t need any further instructions”.

Adjustments

A number of key changes were made to improve myfood24, including simplifying certain words, ensuring sufficient visual appeal for certain functions by changing the colour or font, and reducing the number of pop-ups for missed items to ensure a fast completion time. ‘Mouse hover’ help was also added for certain functions. Certain features were also added to the food search, such as a spell checker, an auto-fill option and presentation of the results in alphabetical order (Table 2).
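A minimal sketch of how the added search features (auto-fill, spelling correction and alphabetical ordering) could behave is given below; the word list and function names are illustrative, and the spelling correction uses Python’s standard `difflib` rather than myfood24’s own implementation.

```python
# Illustrative auto-fill and spelling-correction behaviour for a food search.
import difflib
from typing import List

FOOD_NAMES = ["Apple juice", "Apple pie", "Apple, raw", "Banana, raw", "Baked beans"]

def autocomplete(prefix: str) -> List[str]:
    """Auto-fill: foods starting with the typed prefix, in alphabetical order."""
    prefix = prefix.lower()
    return sorted(name for name in FOOD_NAMES if name.lower().startswith(prefix))

def spell_corrected(term: str) -> List[str]:
    """Spell checker: suggest the closest food names to a misspelled term."""
    return difflib.get_close_matches(term, FOOD_NAMES, n=3, cutoff=0.6)

print(autocomplete("app"))          # ['Apple juice', 'Apple pie', 'Apple, raw']
print(spell_corrected("Aple pie"))  # e.g. ['Apple pie']
```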

Acceptability of myfood24

Table 3 presents the findings from the eight Likert-scale statements before and after the amendments to myfood24. With the beta-version, users assessed the time taken to complete myfood24 as reasonable, liked myfood24’s design and agreed that the terminology used was understandable (57 %, 72 % and 72 % respectively). Conversely, 50 % of adolescents disagreed with the simplicity of searching for food items in the database, and 57 % and 50 % of them neither agreed nor disagreed with the simplicity of adding home recipes or selecting FPS, respectively. However, there were significant improvements in adolescents’ perceptions of myfood24’s functions in the live-version compared with the beta-version, particularly in terms of completion time, design and layout, searching for food items, the ‘make a list’ function, selecting FPS and correcting mistakes. The average SUS score for the beta-version of myfood24 was 66/100 (95 % CI 60, 73) and this increased to 74/100 (95 % CI 71, 77) in the live-version. There was a significant improvement between the mean SUS for the beta-version and live-version of myfood24, with a mean difference of 7.5 points (95 % CI: 0.2, 14.8; P < 0.04) and 80 % power to detect the change.

Table 3 Users’ acceptability of myfood24 before and after making the amendments

Adolescents’ views on myfood24 & interviewer-administered 24 h recall

In stage-II, adolescents rated the ease of undertaking the two methods similarly: 32 (46 %) rated the interviewer-administered 24 h recall as very easy, 28 (40 %) as easy, 9 (13 %) as neither easy nor difficult, and 1 (1 %) as difficult; whereas 28 (40 %) rated myfood24 (live-version) as very easy, 30 (43 %) as easy, 9 (13 %) as neither easy nor difficult, and 3 (4 %) as difficult. None of them rated either method as very difficult. Moreover, 41 (59 %) of the adolescents stated that they preferred the interviewer-administered method, for two main reasons: “human interaction” and the “easiness of completing the food diary” in terms of finding the exact food item, selecting FPS and being prompted to remember more of the food that was eaten. Most of the adolescents who preferred the interview method believed that talking to an actual person would be easier, friendlier and more trustworthy, as they could depend on the interviewer’s knowledge and experience. Participant-37 said that “the interviewers were friendly and you can properly communicate with them”. Participant-45 said they were “better at jogging memory on what I have eaten and give more valid answers as to portion size to differentiate what size was eaten”. Participant-67 said “I didn’t have to do all the work” and another said that it provided “more help and explanation, reminded me when I forgot things”.

In contrast, 29 (41 %) adolescents preferred using myfood24, for three main reasons: confidentiality, the simplicity of myfood24 and the availability of options, and instant nutritional feedback. Participant-50 stated that “I prefer using myfood24 as it’s faster, easy and more efficient because some people like me don’t really like sharing what they eat”. Participant-59 mentioned that “It is easier to admit if you have eaten too much”. Participant-19 mentioned that “The website was quicker and easier to use and you can access the tool at any time”. Participant-69 said there was “No pressure to answer quickly” and that it was “easy to find food and FPS; interview feels too formal”.

Discussion

A number of usability issues were identified in the beta-version of myfood24. Most of these issues were related to navigation, presentation errors and failure to find functions. Consequently, the overall acceptability and SUS score were only ‘Ok’ and the completion time was quite long, at 31 min. However, after these issues were addressed by fixing the errors and adding certain features to the food search, such as a spell checker and an auto-fill option, significant improvements were found in most functions. The completion time of myfood24 (live-version) was reduced to 16 min and the overall SUS improved to ‘Good’. Adolescents rated the ease of using myfood24 as similar to the interviewer-administered method, and 41 % of adolescents preferred using myfood24 for its simplicity and availability of options, as well as the privacy it offered when reporting dietary intake.

Few studies have formally tested the usability of online dietary assessment tools. Most of the published studies focus on describing the development and functionalities of the system rather than detailing the methods used to assess the tool [30]. Other studies have only reported users’ acceptability, with a comparison between using the system and a traditional method, as for YANA [39] and INTAKE24 [12]. Others, such as ASA24 [40] and CANAA-W [10], have used focus groups to evaluate the tool, making it challenging to compare the findings.

Most usability issues were related to searching for food items and selecting FPS, as these are the main functions of the tool. Similar to our findings with myfood24 (beta-version), less than 20 % of adolescents using YANA strongly agreed with the simplicity of searching for foods [39], and 29 % of children disagreed with the simplicity of finding food items in CANAA-W [10]. Although children liked the food photos in CANAA-W, selecting FPS was sometimes difficult [10]. In ASA24, children were unable to understand what to do at a given point in time [40], and in SCRAN24 (later developed into INTAKE24), adolescents were often confused and needed help due to the use of different interface screens and the many instructions to read [12].

Improvements were found in the usability and acceptability of most functions of myfood24 after implementing certain features such as a spell checker and auto-fill option, and adding ‘mouse hover’ help for some functions. Although pop-up prompts for forgotten food items seem to be a helpful function, some users in this study and in the CANAA-W study indicated that pop-ups can become irritating, and they maintained that a reminder on the overview screen would be more helpful [10]. A limited number of children and parents recommended starting with an example of how to use CANAA-W as a useful guide [10]. However, none of the adolescents used the help function in the second stage of this study, after more information and video tutorials had been added.

The maximum completion time in INTAKE24 was reduced from 50 to 21 min after the amendments to the prototype version, resulting in an average of 13.4 min (range 6.2 to 20.3 min) [12]. Similarly, the mean completion time of myfood24 was reduced from 31 (SD = 6) minutes in the beta-version to 16 (SD = 5) minutes in the live-version. However, the average completion time for an online system should be considered an estimate, as many other factors can affect it, such as internet speed, the setting (school, home or other), how familiar the participant is with the system, and the type of food participants have consumed (home cooked or ready meal, for example). In the second stage, adolescents used myfood24 on two non-consecutive days, which may have improved their ability to use the tool. The setting also varied between stage-I and stage-II, which could also have affected the completion time.

A traditional paper-based dietary assessment method takes approximately 30 min to complete and a further hour to code the diary [41]. The overall SUS score of myfood24 improved from 66 (beta-version) to 74 (live-version) out of 100 points. Using the adjective rating scale produced by Bangor et al. [37], myfood24’s beta-version was rated ‘Ok’ (a marginal level, but closer to the acceptable level) and the live-version was rated ‘Good’ (an SUS score above 68 is considered above average and anything below 68 below average). No results were found describing usability scales for other online or computerised dietary assessment tools. However, the SUS for a web-based physical activity intervention had an average score of 73 (SD = 15), which is associated with ‘good’ usability [29].

Adolescents in this study ranked the ease of undertaking myfood24 (live-version) as similar to the interviewer-administered 24 h recall, and 41 % of them stated that they preferred myfood24 because it was quicker, easy to use and provided privacy when reporting their dietary intake. We did not find specific differences in user characteristics between adolescents who preferred myfood24 and those who preferred the interviewer-administered method. Questions about risky or sensitive behaviour may be answered more truthfully when using computerised self-assessment tools [42]. Therefore, using new technology with adolescents is promising, as it might help reduce underreporting of energy intake. Receiving instant feedback was one of the features adolescents liked most in myfood24, and another study found that adolescents would be more likely to use a tool if they could receive instant feedback [43].

This study has strengths in terms of its design: it used standardised and remote usability-testing to cover a wide range of usability issues. Furthermore, the tool was evaluated before and after making the amendments, and one of the most established and validated usability metrics (SUS) was used to evaluate the tool; to our knowledge, no other online dietary assessment website has been evaluated for usability and acceptability using such a metric.

The study has a number of limitations. The live-version was only tested in a school environment rather than the home setting, where participants are more likely to complete their dietary assessment in large studies. Exam and class pressures could have caused adolescents to rush during the second test and thereby introduced bias into the results. In accordance with previous research, Norman [44] reports that evaluations of website design and usefulness depend on users’ emotions and past experiences, and it is therefore important to carry out testing with a diverse sample of participants. Participants in this study were mainly from a white ethnic background, with smaller numbers from black and minority ethnic (BME) groups, which may have affected the findings if different ethnic groups have different dietary patterns and food sources. Although this study sample is representative of the UK population in the area where the adolescents were recruited, more research on the feasibility and acceptability of using myfood24 in different ethnic groups is warranted.

Conclusion

Usability-testing of a new online dietary assessment tool generated important information that was used to improve the usability and acceptability of the final version of myfood24 among adolescents. myfood24 appears to support adolescents’ needs in reporting their dietary intake, which may improve the overall accuracy of adolescents’ self-reported dietary information. Further research is needed to determine socio-economic and ethnic differences in the usability and acceptability of online dietary assessment tools.