1 Introduction

The introduction of the electronic Personal Health Record (ePHR) for all patients with statutory health insurance in Germany in January 2021 was intended to increase the effectiveness, efficiency, and transparency of patient healthcare [1, 2]. In this context, the ePHR (in German: elektronische Patientenakte, ePA) serves as a platform for collecting and sharing health-related documents generated during treatment, and is provided by health insurance companies. Healthcare providers, health insurers, and the patients themselves can add health-related data and documents. The central concept of the German ePHR is the self-administration of the record and the data stored in it by the patient. The patient assigns access rights to service providers and decides on the visibility and accessibility of content on a case-by-case basis. Patients can add or remove data at will [3]. The ePHR is accessed by the patient via a smartphone or a web application. However, patient adoption of the ePHR in Germany has been rather slow to date. Considerably fewer ePHRs were registered in the second half of 2022 (84,000) than in the first half of the year (177,000). These numbers highlight difficulties in the implementation of the ePHR. Despite slight increases in 2023, the overall usage of the ePHR remains very low; just under 1% of people with statutory health insurance in Germany had registered an ePHR [4].

The self-reliant management of the ePHR by the patient assumes that the user is competent and informed, with the appropriate ability to operate and understand the ePHR. Fostering competence in ePHR use is required to ensure that the ePHR can be implemented effectively, as literacy in dealing with technology is related to ePHR adoption [5, 6]. In Germany, concerns have been raised as to whether older adults have the ability to use the ePHR; this is described as a factor restricting ePHR adoption [7]. The slow ePHR implementation in Germany is also attributed (among other factors) to low demand from patients, who are either unaware of the existence of the ePHR or are not interested in its use [8]. Successful ePHR implementation can have a positive effect on patient care and the efficiency of the healthcare system, which is well documented in the literature [9, 10]. However, in addition to competence in the correct operation of the ePHR, the patient’s ability to understand and benefit from the health information presented is also crucial. Health literacy in general, and eHealth literacy in particular, are significant factors that contribute to both acceptance by patients and the adoption of the ePHR [11,12,13,14].

For older adults, this results in access barriers, often due to less experience with internet-based software and digital applications [15], as well as a decline in cognitive, perceptual, and motor abilities [16]. Due to longer medical histories, more frequent contact with healthcare providers, and possible multimorbidity, older adults are likely to accumulate a large amount of health data in the ePHR. The result could be an extensive quantity of available data that could benefit their healthcare provision but would need to be managed by the older adult for effective use [15, 17]. While the digitalization of services and procedures is often positively highlighted during implementation processes, these developments can exacerbate or create inequalities in participation, especially for older adults [18]. The potential benefit of implementing the ePHR is thus offset by the need for older adults to develop skills in using the ePHR competently [19, 20]. Among other things, Lober et al. [21] identified poor computer literacy and health issues as central barriers to ePHR adoption among the older population. They also note that older adults can be assisted by others (for example, nursing staff) in using the ePHR, but that autonomy is lost when older adults relinquish management of the record [21]. Due to the very low number of ePHR users in Germany, the use and acceptance of this technology among the older population have not yet been scientifically investigated. However, concerns have been raised in the past about the accessibility of the ePHR for the target group of older adults [7].

The tension between the potential benefit of use and the willingness to participate, on the one hand, and the actually limited use of the ePHR by older adults, on the other, has been known for some time [22, 23], but there are few established solutions to address it. In this regard, Luo et al. [24] observed in a study that ePHR use among older adults depends on how easily the stored health information can be understood. In their systematic review, Abd-alrazaq et al. [25] argue that to increase the willingness to use the ePHR, measures such as training, education programs, and other assistance need to be implemented. Against this background, the ePA Coach project is designing and creating an e-learning platform specifically adapted to the needs of older adults. The platform provides an empowerment tool to educate older adults on the use of the ePHR and the data stored in it, and to practice specific application scenarios.

As the usability of health technology plays a crucial role in its adoption by older adults [26, 27], the purpose of the explorative study presented here is to survey the experiences of older adults when using the learning platform ePA Coach, which was presented as a functional, high-fidelity prototype [28, 29]. The learning platform was developed in an iterative and participatory research process, taking into account the specific needs of the target population of adults over 65 years of age [30,31,32]. In this study, the usability of the e-learning platform is evaluated and the user experience of older users is investigated. The results will be used to improve the usability of the learning platform in the future, to ensure accessibility and facilitate its adoption. The usage behavior of the study participants is also analyzed. Since it is known that the health literacy of individuals influences the acceptance and use of the ePHR, this study investigates whether general attitudes toward accessing health information on the internet influence the evaluation of the learning platform, or whether other factors are at play. We conducted one study arm as an online intervention and a second, experimental face-to-face ‘group think aloud’ study arm with older adults. The study is part of the project ‘ePA Coach - Digital Sovereignty in the Context of the Electronic Personal Health Record’, funded by the German Federal Ministry of Education and Research (BMBF, Grant No. 16SV8483). In the context of this study, the following research questions were to be answered:

1.1 Primary research question

How do older adults over 65 years of age rate the usability and user experience of the ePA Coach learning platform?

1.2 Secondary research questions

What is the relationship between the usability/user experience of the ePA Coach e-learning platform and the attitudes of older adults toward accessing health information on the internet?

What is the relationship between the usability/user experience of the ePA Coach e-learning platform and the ease of using this health information website?

What are the main usability problems identified by the face-to-face group during the use of the ePA Coach e-learning platform?

2 Methods

2.1 Trial design

The aim of this intervention study was to evaluate the ePA Coach learning platform as used by older adults. Particular attention was paid to the assessment of the usability and user experience of the platform. The study was conducted as an online intervention between July and October 2022, on the participants’ own devices (PC, smartphone, or tablet) and in their own homes. The study design was chosen in light of the COVID-19 pandemic, which was still ongoing at the time; the goal was to minimize the number of in-person study appointments. Due to the exploratory nature of the study, and since the learning platform had not been used in a prior study, no sample size calculation was performed. The aim was to collect feedback from the target group of older adults to assess the usability of the platform. To gain a deeper understanding of the usability and user experience for this target group, the learning platform was additionally evaluated with a smaller group in a face-to-face appointment, based on the ‘think aloud’ method [33]. The study was registered in the German Clinical Trials Register (registration number: DRKS00029700) and received approval from the Ethics Committee of the Charité - Universitätsmedizin Berlin (application number: EA1/081/22).

2.2 Study procedure

To optimize the organizational and technical procedure of the study, a separate pilot test was conducted with a volunteer from the target group, who did not participate in the final study. The pilot test resulted in minor subsequent changes to the process and design of the study.

Interested subjects were briefed on the nature, procedure, and objective of the study using a document provided (study information for potential participants). Informed consent to participate in the study was given online in the form of a checkbox, as agreed with the responsible data protection authority. The face-to-face group received on-site information, had the opportunity to ask questions about the study, and then signed the informed consent form.

After consenting to participate in the study (online group), study participants received a link to a baseline survey on the REDCap (Research Electronic Data Capture; Vanderbilt University) online questionnaire platform [34, 35]. REDCap is a tool for conducting online surveys that is particularly suitable for online studies in terms of data protection, owing to its cooperation with university institutions. We surveyed the baseline characteristics of the participants (such as age, sex, previous experience in using the ePHR, previous experience with e-learning, etc.). Furthermore, Part 1 of the eHealth Impact Questionnaire (eHIQ) [36] was included in the REDCap platform as a baseline assessment. After completing the assessment, participants received a link to the ePA Coach learning platform, where they could create a user account with a self-chosen combination of username (e-mail address) and password. Subsequently, the learning content on the platform could be accessed and worked through freely in any order. The study period was one week, starting after the completion of the baseline questionnaire. During the one-week intervention period, the participants could complete the learning content at their own pace, following the intention-to-treat principle [37].

At the end of the seven-day usage period, a follow-up assessment was conducted on the REDCap platform. For this purpose, we sent the study participants a link to a survey in a separate e-mail. The System Usability Scale (SUS) [38], eHIQ Part 2 [36], and the User Experience Questionnaire (UEQ) [39] were used. Since we aimed to examine the usage behavior of the study participants over the course of the week, the pseudonymized logging data (e.g., duration and frequency of use) were analyzed after completion of the follow-up survey.

2.3 Face-to-face group

In addition to the group of participants who were given online access to the learning platform, a second study arm was conducted. In this study arm, we carried out the same basic study procedure, but the use of the platform by older adults was designed as face-to-face group training during a one-day appointment. The participating older adults were invited to the research group’s facility and each participant was provided with a laptop to use the learning platform for three hours. While using the learning platform, participants had the opportunity to interact with each other and with the study staff regarding the learning content and the use of ePA Coach. In a ‘group think aloud’ approach, participants were asked to point out any difficulties and comprehension problems. This method has been established as a useful tool for usability testing and has been applied to ePHR usability testing in the past [40]. While using the learning platform, the study participants had the opportunity to note down usability problems on a piece of paper, which was collected after the session. Likewise, comments or queries from the participants were recorded by the study staff, who also noted any interesting observations regarding the use of the platform by the older adults. The objective was to survey the usability and user experience in a face-to-face group setting, which allowed us to include less tech-savvy participants. This testing also provided the opportunity to gain a deeper insight into the usability difficulties encountered by the participants.

3 Materials

3.1 The ePA Coach learning platform

As part of the ePA Coach research project, a learning platform was developed in the form of a website tailored to the needs of older adults. The general needs and preferences of older adults regarding an empowerment tool for the ePHR were investigated in iterative surveys and evaluations [30, 31]. The results were implemented in a target-group-oriented design of the learning platform and thus directly influenced its development. This resulted in a simple color scheme and menu structure, an option to change the font size, strong contrast between text and background, and the avoidance of foreign-language terms. The Octalysis framework was used as the basis for the design of the gamification elements for user motivation and was adapted in the context of previous user studies [41, 42]. As a result, motivational elements that met the needs of the target group of older adults, such as visual progress bars and interactive exercises, were implemented. In addition, the learning content was divided into three levels within thematic areas: ‘beginner’, ‘advanced’, and ‘expert’. Each level was based on different didactic concepts and included unique content. This classification was based on the complexity of the content as well as the cognitive dimensions to be addressed [43]. A micro-learning approach was adopted so that the competencies required for confident use of the ePHR could be acquired effectively. The educational approach of the learning platform was based on the DigComp framework of the European Union [44]. Based on this framework, the specific learning requirements and preferences of the target group of older adults were identified and considered in the content creation process. Thus, the categories provided by the framework were supplemented with the learning needs of the target group. The learning platform offered self-contained learning units that users could choose individually, in any order, and work on in accordance with their own interests. A total of 14 learning units were available. The learning units were multimedia-based and offered both visual (videos, pictures) and textual content (see Fig. 1). Each learning unit had an estimated completion time of approximately six to eight minutes.

Fig. 1

Screenshots of the ePA Coach learning platform: (A) Landing page with introduction and instructions, (B) Overview of the available learning topics, (C) Exemplary interactive learning task using ePHR mockup, (D) Exemplary learning unit with video content; we translated the screenshots from German to English

The learning progress was tracked and saved from session to session, and the units contained interactive elements that facilitated learning and encouraged users to practice what they had learned. Interactive mockups of an ePHR, included in various learning units, allowed users to practice navigation and the operation of specific tasks without having to use real health data. Additional interactive elements such as exercises or tips were designed to increase learning motivation.

A virtual companion was integrated in the form of a chat-based learning assistant that was constantly available to support users with questions relating to topics such as the functionality of the platform or specific questions about the ePHR (e.g., ‘How do I change the font size?’ or ‘What exactly is the ePHR?’). We used the open-source version of the RASA Conversational AI software [45]. To build a dialogue model with RASA, we provided ‘intents’, which are used to detect the intent of the user during the conversation. For each intent, one or more ‘responses’ were made available. We also created so-called ‘stories’ as training data for the dialogue model, which serve as typical conversations for each intent. The intents were built by providing example questions that are used for their detection. For the stage of ePA Coach used in this study, we created intents regarding the ePHR-related topics addressed in ePA Coach, as well as questions on how to use the platform. Additionally, we added questions and answers based on gematik’s FAQ [46], gematik being the agency responsible for providing a regulatory framework for the ePHR in Germany. Altogether, the dialogue model of the interim evaluation consisted of 49 intents, using 588 example questions and 54 responses. Likewise, we integrated a glossary into the website, which provided explanations of terminology that might not be familiar to the target group of older adults.
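To illustrate how these building blocks fit together, the following sketch expresses one intent with its example questions, a response, and a story as Python data; RASA itself defines these in YAML training files (NLU data, domain, stories), and the intent name, questions, and texts below are invented for illustration rather than taken from the project’s actual training data.

# Illustrative only: how an intent, its example questions, a response, and a
# story relate in the dialogue model. Names and texts are hypothetical.
dialogue_model_sketch = {
    "intents": {
        "ask_change_font_size": {
            # example questions used to detect the intent
            "examples": [
                "How do I change the font size?",
                "Can I make the text larger?",
            ],
            # one or more responses returned once the intent is detected
            "responses": [
                "You can change the font size via the settings menu at the top of the page.",
            ],
        },
    },
    # 'stories' describe typical conversation flows used as training data
    "stories": [
        {
            "story": "font size question",
            "steps": [
                {"intent": "ask_change_font_size"},
                {"action": "utter_ask_change_font_size"},
            ],
        },
    ],
}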

The available learning content is related to the following overarching topics:

1. Basics of the ePA: Understanding the basic concepts of the ePHR;

2. Manage health data: Managing data, information, and digital content stored in the ePHR;

3. Health data security: Access rights and security of health data in the ePHR.

3.2 Assessments

3.2.1 System usability scale

The SUS is a short questionnaire to assess the usability of systems and products [38, 47]. For measuring the perceived usability of software and hardware, the SUS is the most established questionnaire and has been used in multiple studies across various research areas [48]. It is a frequently used assessment tool both in evaluating e-learning [49, 50] and eHealth applications [51,52,53]. It contains 10 items, which use a five-point Likert scale (1 = ‘strongly disagree’, 5 = ‘strongly agree’). The calculated total score ranges from 0 to 100 (perfect usability). We used the validated German version of the SUS [54].
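For illustration, the standard SUS scoring rule (odd, positively worded items contribute the response minus 1; even, negatively worded items contribute 5 minus the response; the sum is multiplied by 2.5) can be sketched in Python as follows; this is a minimal illustration of the published scoring procedure, not code used in the study.

def sus_score(responses):
    """Compute the SUS total score (0-100) from the ten item responses.

    `responses` holds ten integers from 1 ('strongly disagree') to
    5 ('strongly agree'), in questionnaire order.
    """
    if len(responses) != 10:
        raise ValueError("The SUS has exactly 10 items.")
    contributions = []
    for item_number, value in enumerate(responses, start=1):
        if item_number % 2 == 1:      # odd items are positively worded
            contributions.append(value - 1)
        else:                         # even items are negatively worded
            contributions.append(5 - value)
    return sum(contributions) * 2.5   # scales the 0-40 raw sum to 0-100


print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # prints 75.0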

3.2.2 User experience questionnaire

The UEQ is an instrument employed in user experience research to measure users’ perceptions when interacting with systems or services [39]. It comprises a set of dimensions, including attractiveness, efficiency, perspicuity, dependability, stimulation, and novelty. The questionnaire uses a seven-point Likert scale ranging from 1 to 7. The raw scores are then transformed into values between −3 (horribly bad) and +3 (extremely good). The German version of the UEQ has been validated and was utilized in this study [55].
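As a minimal sketch of this rescaling, assuming the item responses have already been aligned for polarity (the official UEQ analysis tool handles the randomized item order and polarity), the calculation of a scale score can be expressed as follows:

# Raw answers on the 1-7 scale are shifted to the -3..+3 range; each UEQ
# scale score is the mean of its (polarity-aligned) items.

def rescale(raw_answer: int) -> int:
    """Map a 1-7 raw answer to the -3..+3 UEQ metric."""
    return raw_answer - 4


def scale_score(item_answers: list[int]) -> float:
    """Mean of the rescaled items belonging to one UEQ scale."""
    return sum(rescale(a) for a in item_answers) / len(item_answers)


print(scale_score([6, 5, 6, 7]))  # e.g., four hypothetical perspicuity items -> 2.0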

3.2.3 eHealth impact questionnaire

The eHIQ is a standardized and validated tool used to assess attitudes toward health-related information on the internet, and the learning experience associated with such websites and eHealth interventions in general [36]. It consists of two sections with a total of 37 items. Part 1 consists of two subscales and surveys general attitudes toward the internet as a source of health information; it was used as part of our baseline assessment. Part 2 includes three subscales and was used as part of the follow-up assessment, gathering information about respondents’ experiences while using an eHealth intervention. Both parts of the assessment use a five-point Likert scale, ranging from ‘strongly disagree’ (1) to ‘strongly agree’ (5). The scores are converted to a 0-100 metric, where 0 represents the most negative and 100 the most positive possible value for each subscale.

The eHIQ subscales are:

  • eHIQ-Part 1:

    • Attitudes toward online health information (S1);

    • Attitudes toward sharing health experiences online (S2).

  • eHIQ-Part 2:

    • Confidence and identification (S3);

    • Information and presentation (S4);

    • Understanding and motivation (S5).
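As a rough illustration of the 0-100 conversion described above, a summed subscale can be linearly rescaled as sketched below; the item counts and answers are hypothetical, and the published eHIQ scoring instructions remain authoritative.

def to_percent_metric(item_answers: list[int]) -> float:
    """Linearly rescale a summed five-point Likert subscale to 0-100.

    0 corresponds to answering 'strongly disagree' (1) on every item,
    100 to answering 'strongly agree' (5) on every item. Any reverse-scored
    items are assumed to have been recoded beforehand.
    """
    raw = sum(item_answers)
    minimum = len(item_answers) * 1
    maximum = len(item_answers) * 5
    return (raw - minimum) / (maximum - minimum) * 100


print(to_percent_metric([4, 4, 3, 5, 4, 4]))  # six hypothetical items -> 75.0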

The eHIQ was originally published in English and was translated into German by two scientists from our research group who were not part of the study staff. The two translations were produced independently, then compared, and a consolidated version was created from both. The translation was then shared with the authors of the original assessment for review. Their comments and suggestions were incorporated into an updated version of the translation, which the original authors approved as the final version.

3.2.4 Recruiting

Interested individuals were contacted directly, via multipliers (newsletters of senior citizens’ associations and institutions), or via the internal database of the research group (notifying individuals who had given their consent to be contacted for the purposes of the study). In this process, they received the study information for potential participants, which named the principal investigator of this study, who could be contacted with any questions. Likewise, the recruitment e-mail included a link to the REDCap platform, where the screening took place and consent for participation was obtained.

The recruitment of the face-to-face group was organized via senior citizen facilities and senior citizen cafes.

3.2.5 Inclusion and exclusion criteria

Prior to the start of the study, the eligibility of the potential study participants was checked by assessing the inclusion and exclusion criteria on the REDCap platform. Only after meeting the inclusion and exclusion criteria could consent to participate in the study be given. The following inclusion and exclusion criteria applied to the study subjects of both groups, for the pilot test as well as for the main study.

Inclusion criteria:

  • Age: ≥ 65 years;

  • Internet access at home;

  • Availability of a device to access websites;

  • Ability to read and understand German-language texts.

Exclusion criteria:

  • Diagnosed cognitive disorders;

  • Sensory and/or motor deficits that prevent the use of websites or the completion of online questionnaires;

  • Legal guardianship.

3.3 Data analysis

Quantitative study data were exported from the REDCap platform and analyzed using SPSS statistical software (IBM SPSS Statistics version 27; IBM Corp., Armonk, NY, USA). Only data from fully completed questionnaires were analyzed, and only datasets with both baseline and follow-up assessments were used. We used descriptive statistics for data analysis in this study; mean values and standard deviations were calculated. Inductive statistics were used to determine whether the characteristics of the participants included in the analysis differed from those of the participants lost to follow-up. In addition, the dataset was tested for normal distribution using the Kolmogorov-Smirnov and Shapiro-Wilk tests. Since a normal distribution was present, correlations between variables were determined by calculating Pearson correlations. Before calculating the Pearson correlations, the linearity of the relationships was verified and the absence of outliers was checked using a graphical representation. We used the standard alpha level of .05 for the interpretation.
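For readers who prefer script form, these steps can be sketched in Python/SciPy as an equivalent to the SPSS procedures used; the file name and column names below are hypothetical.

# Sketch of the quantitative analysis steps described above (descriptives,
# normality checks, Pearson correlation), as a stand-in for SPSS.
import pandas as pd
from scipy import stats

df = pd.read_csv("redcap_export.csv")  # hypothetical REDCap export

# Descriptive statistics (means and standard deviations)
print(df[["sus_score", "ehiq_s4"]].agg(["mean", "std"]))

# Normality checks: Kolmogorov-Smirnov (against a fitted normal) and Shapiro-Wilk
for col in ["sus_score", "ehiq_s4"]:
    x = df[col].dropna()
    print(col, stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))))
    print(col, stats.shapiro(x))

# Pearson correlation between an eHIQ subscale and the SUS score
pair = df[["ehiq_s4", "sus_score"]].dropna()
r, p = stats.pearsonr(pair["ehiq_s4"], pair["sus_score"])
print(f"r = {r:.2f}, p = {p:.3f}")  # interpreted against alpha = .05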

In the face-to-face group, problems and obstacles relating to the use of the learning platform, as well as the comprehension difficulties of older adults during use, were noted by the study staff and the users themselves. Based on the ‘think aloud’ protocol [33], participants were instructed to verbalize their impressions, as well as any difficulties, while using the learning platform. These impressions, in conjunction with observations made by the study staff, were documented in a protocol. The notes and statements were then paraphrased, the data were reduced, and inductive code groups were created. This procedure, as well as the analysis of the resulting dataset, was carried out according to Mayring [56].

4 Results

4.1 Sample description

A total of 149 participants completed the online baseline assessment, while 57 completed the follow-up evaluation. After eliminating the datasets of participants with duplicate participation, a dataset of 53 participants who had completed both the baseline and the follow-up survey was included in the analysis of the online study arm. Table 1 shows the characteristics of the online and face-to-face groups. The mean age of the participants in the online group and the face-to-face group was 72.75 years and 73.67 years, respectively. More men than women participated in the online group (56.6%), while more women participated in the face-to-face group (66.7%).

The participants who completed both the baseline and follow-up evaluation differed significantly from those who were lost to follow-up in terms of computer (U = 1858.50, Z = -2.077, p = .038) and tablet (U = 1762.50, Z = -2.059, p = .039) usage. The dropouts used both technologies less frequently.

Table 1 Participants’ characteristics

4.2 The eHIQ and the SUS

The respondents who participated in the online evaluation reported a mean score of 62.00 (SD = 15.52, 95% CI [59.86, 65.14]) with regard to their attitudes toward online health information and a score of 60.37 (SD = 17.09, 95% CI [55.74, 65.00]) in respect of their attitudes toward sharing health experiences online (Table 2). The mean values for the face-to-face group were lower (S1: 53.33, SD = 16.02, 95% CI [40.53, 66.13], S2: 55.56, SD = 10.43, 95% CI [47.23, 63.89]). In the case of the eHIQ subscales S3, S4, and S5 in the online group, subscale S4 (information and presentation) had the highest mean score (69.50, SD = 14.21, 95% CI [64.88, 74.12]).

The SUS mean score was 67.97 (SD = 19.20, 95% CI [60.56, 75.38]) for the online group and 70.41 (SD = 11.56, 95% CI [62.24, 78.60]) for the face-to-face group.

Table 2 Results of the eHIQ and the SUS

4.3 The UEQ

The results were compared to benchmark data from 20,190 participants across 452 studies, based on various products such as software and web pages [39]. Five of the six UEQ scales (attractiveness, efficiency, dependability, stimulation, and novelty) were rated below average compared to the benchmark (Table 3). Perspicuity was the only scale that received an above-average rating compared to the benchmark.

Table 3 UEQ results of the scales of the online group

The scales can be subdivided into pragmatic quality (perspicuity, efficiency, and dependability) and hedonic quality (stimulation, novelty). The pragmatic quality was 0.97 and the hedonic quality was 0.61 in our sample (Fig. 2).

Fig. 2

Visualization of the UEQ scores compared to the benchmark; error bars represent the 95% confidence interval (online group)

4.4 Usage behavior of the online group

The learning platform, ePA Coach, was used by the online group for an average of 5717.69 s during the intervention period of one week (Table 4). This corresponds to an average usage time of 95.29 min. The platform was used on an average of 1.94 days, with an average of 2.29 logins/sessions. This means that the learning platform was used for an average of 41.61 min per session.
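Spelled out from the averages in Table 4: 5717.69 s / 60 ≈ 95.29 min of total usage time, and 95.29 min / 2.29 sessions ≈ 41.61 min per session.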

Table 4 Usage behavior of the online group

4.5 Correlation of the eHIQ subscales with the SUS and the UEQ

4.5.1 Correlation of the eHIQ and the SUS

We calculated the Pearson correlation coefficient (Pearson’s r) between the eHIQ subscales and the SUS scores of the online group to gain an insight into which variables might correlate with the evaluation of the usability of our e-learning portal. The correlations between the eHIQ subscales S3, S4, and S5 and the SUS score were all significant (Table 5). The strength of the correlation was moderate between the eHIQ subscales S3 and S5 and the SUS score, and strong between subscale S4 and the SUS score [58].

Table 5 Correlation of the SUS with the eHIQ subscales (online group)

4.5.2 Correlation of the eHIQ and the UEQ

Furthermore, the Pearson correlation between the eHIQ subscales and the UEQ scales in the online group was calculated to ascertain whether the user experience of our e-learning portal, as rated by the older adults, correlates with their attitudes toward health information on the internet and their eHIQ rating of our e-learning portal. No significant relationship was found between any of the UEQ scales and the eHIQ subscale S1 (attitudes toward online health information), or between any of the UEQ scales and the eHIQ subscale S2 (attitudes toward sharing health experiences online).

The eHIQ subscale S3 (confidence and identification) was positively correlated with all scales of the UEQ: attractiveness (r(44) = 0.63, p < .001), perspicuity (r(44) = 0.47, p < .001), efficiency (r(44) = 0.37, p = .012), dependability (r(44) = 0.43, p = .003), stimulation (r(44) = 0.63, p < .001), and novelty (r(44) = 0.47, p < .001).

The eHIQ subscale S4 (information and presentation) was also positively correlated with all scales of the UEQ: attractiveness (r(42) = 0.66, p < .001), perspicuity (r(42) = 0.75, p < .001), efficiency (r(42) = 0.57, p < .001), dependability (r(42) = 0.68, p < .001), stimulation (r(42) = 0.55, p < .001), and novelty (r(42) = 0.37, p = .014).

Likewise, the eHIQ subscale S5 (understanding and motivation) was positively correlated with all scales of the UEQ: attractiveness (r(42) = 0.66, p < .001), perspicuity (r(42) = 0.52, p < .001), efficiency (r(42) = 0.36, p = .016), dependability (r(42) = 0.50, p < .001), stimulation (r(42) = 0.65, p < .001), and novelty (r(42) = 0.55, p = .014).

All correlations between the UEQ scales and scales S3-S5 of the eHIQ were significant. Especially notable were the high correlations between S3 and S5 of the eHIQ and the UEQ scale, attractiveness, which represents the overall impression of the e-learning portal.

4.6 Face-to-face group - main usability problems

The usability problems observed by the study personnel and reported by the older participants regarding the use of the learning platform during the face-to-face group can be divided into six main categories. These categories are shown in Table 6.

Almost all participants had problems registering on the learning platform and creating a user account. We observed that many participants wished to use a password with which they were familiar; however, this was often not accepted because it did not meet the platform’s requirements for a secure password. Similarly, many users were unfamiliar with the function for displaying the password as they typed it, resulting in many typing errors. In addition, it was possible to select a user avatar during the registration process, which confused participants because they did not understand the purpose of this function.

As a second issue, navigation on the learning platform appeared to be an obstacle for some individuals, as they lost track of the learning units they had been working on or had problems switching between pages correctly. Similarly, most participants had no prior experience with chat-based assistance tools and were unable to minimize the chat window once it had been opened. Another common usability issue resulted from the YouTube integration for videos. Two videos, available as part of the learning units, were integrated via YouTube, which resulted in some users being shown additional external links after the video; this caused some participants to leave the learning platform. Finally, some participants considered the default font size to be too small. The learning platform offers the option of adjusting this individually; however, not all participants were able to locate this function.

Table 6 Overview of main usability problems in the face-to-face group

5 Discussion

In summary, our study revealed that the usability of the learning platform was rated positively overall by the older adults, but still showed potential for improvement. However, the user experience was mostly rated below average compared to the benchmark. In the face-to-face group, key aspects were identified that must be taken into account during the further development of the learning platform. It became apparent that while the learning content was regarded as particularly positive, the usability of the platform could still be improved in terms of its target-group-oriented design.

When examining the usage behavior of the participants in the online group, it became clear that usage was, on average, distributed over several sessions. The platform offered 14 learning units, for which we assumed an average completion time of up to eight minutes, corresponding to a total usage time of 112 min. In our study, the average time spent on the platform was 95.29 min. However, it is not possible to deduce from the usage time and frequency of use how many and which of the learning units were completed.

The mean SUS rating for the online group was 67.97, which can be interpreted as ranging between ‘ok’ and ‘good’ usability [59]. This score is slightly higher than the mean score for internet-based e-learning platforms reported in the systematic review by Vlachogianni and Tselios [50]. They calculated a mean score of 66.25 (SD = 12.42) based on the results of 77 studies. They did not focus on e-learning platforms for older adults, but found no significant relationship between the age of the participants and the SUS score. The Web Content Accessibility Guidelines for the development of websites list aspects that are seen as crucial for ensuring good usability of websites for older adults [60]. Many of these aspects were taken into account in the development of the ePA Coach learning platform and its content. For example, a single-colored light background was chosen that offered a clear contrast to the written text, as well as a large font size. Moreover, a consistent design was incorporated throughout the website. Other recommendations include offering alternative media formats, in addition to text, when presenting learning content for older adults [61]. This was also considered in the creation of the ePA Coach learning content, as the website includes video material as well as practical usage scenarios. Despite the overall positive quantitative assessment of the usability of the learning platform in the online group, it became apparent during discussions with the face-to-face group that some usability aspects of the learning platform still needed improvement. The difficulties mentioned by the participants, which typically arise for the target group of older adults when using web content, were issues with font size, navigation, and website layout. Although we had identified these issues early in the development of the ePA Coach and in previous studies we had conducted with the target group [31], it became apparent that these aspects had still not received sufficient attention in the development of the learning platform. Nelson et al. [62] reported similar problems regarding layout and navigation for empowerment tools related to the ePHR during usability testing; however, their study included younger, chronically ill patients. For older adults, as investigated in our study, these aspects of usability may be even more important, since the difficulties encountered by those with a potentially lower affinity for technology need to be addressed.

Other issues that emerged from the results of the face-to-face group, in particular, were problems relating to navigating the learning platform. The use of micro-learning content and the educational approach of an individual learning path is widely recommended and described as motivating for older adults [63]. Pappas et al. [63] also note that short, individual learning units are appropriate for older learners, as is the subdivision of learning content into modules. In addition, they recommend the inclusion of practice questions and examples [63]. On the other hand, Sheridan [64] points out that, especially in the case of complex e-learning programs, navigational aspects can be a challenge for older adults, as was evident in our study. The navigation problems we identified with the ePA Coach learning platform may be attributable to an overly complex presentation of the modules and an unintuitive menu design.

Due to the high correlation between the SUS and the UEQ, Takano et al. [65] recommend in their systematic review a complementary application of both assessments to ensure a more holistic understanding of user experience. Within our sample, the pragmatic quality in the UEQ achieved a rating of 0.97, whereas the hedonic quality attained a score of 0.61. Values exceeding 0.8 indicate a positive assessment, while values below −0.8 signify a negative evaluation [57]. The results of the UEQ showed that perspicuity was rated above the benchmark value. This means that the e-learning platform was successfully designed as a low-barrier website, which was not self-evident given the complexity of the topic. However, the hedonic quality (stimulation, novelty) needs improvement. Stimulation could be improved, for example, by using more interactive ePHR mockups. More practical, hands-on experience could increase the hedonic quality and thus the attractiveness.

Sheridan [64] notes that on-screen support is needed for older adults when using e-learning services. We incorporated a chat-based learning assistant for this very purpose. However, according to the face-to-face group, its usage caused certain issues. It became apparent that our chatbot was too complex and was in fact a distraction. Therefore, it can be assumed that for target groups with little experience of using such systems, other on-screen support options may be more appropriate.

To investigate the attitudes of the study participants toward online health information and the impact of using our e-learning platform, we included the eHIQ. Since the authors of the original eHIQ did not report a cutoff for the interpretation of the subscale scores, we used the considerations of Talboom-Kamp et al. [66] as a guide and interpreted a value of 65 as a cutoff for a positive evaluation. The scores achieved on the eHIQ subscales S1 and S2 were higher in the online group of our study than in the validation study of the German population [67]. This may be because the participants in the present study had a high interest in health-related content on the internet and therefore decided to participate in our study. This is also apparent in the relatively high frequency of technology usage (as indicated by the frequency of smartphone usage).

When looking further at the results of the eHIQ subscales, it is noticeable that the content of the learning platform was rated particularly positively. The subscale S4, which evaluated the way in which the study participants perceived the content and its presentation, received a mean score of 69.5.

Kang and An [68] mention that, in relation to current health-related information and communication technology for older adults, not only ease of use (which we have considered here in terms of usability) but also perceived usefulness is decisive in the assessment of such systems. Furthermore, eHealth literacy, self-efficacy, and communication preference are assumed to have a clear impact on the acceptance and use of the ePHR [22, 23]. Looking at the study data, it is noticeable that there is a strong positive correlation between the attitude toward health information on the internet and the usability evaluation of the learning platform within the online group. It is possible that a certain degree of skepticism toward health-related information on the internet influenced the perceived usability of our platform. Irizarry et al. [69] show that eHealth literacy and attitudes toward health information online have an impact on the intention to use an ePHR. Young et al. [70] also mention that the attitude toward technology (in our study: toward health information on the internet) influences the adoption of the ePHR. Thus, we investigated whether some aspects related to eHealth literacy had an impact on the usability of the e-learning platform ePA Coach. We observed that the eHIQ subscales attitude toward online health information, confidence and identification, information and presentation, as well as understanding and motivation showed a positive correlation with the usability rating of the learning platform. This indicates that eHealth literacy played a role in the evaluation of the ePA Coach.

A correlation calculation was also carried out between the eHIQ scales and the UEQ scales in the online group. The participants’ evaluation of our health-related website, measured with the eHIQ subscales S3-S5 (confidence and identification, information and presentation, understanding and motivation), was positively correlated with the UEQ scales. These results can be partly explained by the constructs of the model by Zardari et al. [71], the User Experience-Based E-Learning Acceptance Model for Sustainable Higher Education. Their model showed that information quality may affect the users’ behavioral intention to use an e-learning offering. This may partly explain why information and presentation, as measured by the eHIQ, correlates with user experience scales such as attractiveness.

Zardari et al. also postulate that the construct ‘appeal’, which they define as a positive emotional reaction toward an e-learning website, has a positive influence on satisfaction with regard to the use of an e-learning portal. In the broadest sense, this also relates to understanding and motivation (eHIQ S5) with regard to an e-learning platform. In our results, the understanding and motivation subscale also correlated positively with the user experience on all scales. In Zardari et al.’s model, self-efficacy has a positive influence on the behavioral intention to use an e-learning portal. The eHIQ subscale S3 (confidence and identification) has similarities with self-efficacy and also correlated positively with user experience.

Our results may contribute to a deeper understanding of the usability of learning platforms for older adults, in particular, helping to empower them to use an ePHR. Incorporating different methodological approaches for usability evaluation allowed us to assess the platform with a larger population and still receive precise feedback on usability issues. This corresponds to the recommendations of Baharum et al. [72], who examined the usability evaluation of different eHealth applications in their literature review. They state that although evaluation by questionnaire is the most common survey method, other methodological approaches, such as those used in the present study, provide a better understanding of usability strengths and weaknesses.

5.1 Limitations

A strength of our study is a design that allowed us to examine the usability and user experience of an e-learning website among the rarely researched target group of older adults. Another strength is the investigation of an innovative empowerment tool for use with an ePHR, which will become increasingly important in the German healthcare system in the future. Nevertheless, the study did have certain limitations that might influence the interpretation of the results. An a priori sample size calculation was not carried out, as an exploratory study design was chosen. Another limitation was the absence of a control group. Due to the small sample size, the face-to-face group was not suitable as a control group and the application of inductive statistics was not possible. The absence of a control group limits the internal validity and introduces potential selection bias. The randomization of participants and the blinding of both participants and researchers were also not possible, which may have resulted in observation and confirmation bias. Due to the intensive supervision of the participants in the face-to-face group, only a limited number of participants could be included with the available study personnel and resources. As the statements in the face-to-face group were not audio recorded and transcribed, but documented in protocols, some bias may have been introduced here. Since the usability issues of the face-to-face group were only documented but not subjectively rated, no conclusions can be drawn about the severity of the problems. Furthermore, the study population largely consisted of individuals with a relatively high frequency of smartphone and other technology usage, indicating a potentially high affinity for technology. As a result, the findings of our study may not be generalizable to parts of the target group of older adults with a lower affinity for technology. The online study group also had a relatively high dropout rate (63%). Since those lost to follow-up showed significantly less frequent computer and tablet usage, participants with less affinity for or experience with technology might have been more likely to drop out. Problems in using either the questionnaire or the e-learning platform might have been reasons for dropping out of the study. It is also plausible that not all participants who took part in the baseline survey were fully aware that the study was an intervention study with a one-week duration. Some may have expected a one-time survey and, therefore, did not wish to participate further after completing the questionnaire. Additionally, we also identified issues relating to the accuracy of the e-mail addresses provided for receiving study links.

6 Conclusion

This study provides key insights into the usability of a learning platform in the context of the ePHR. These insights will be used for further development to improve usability and user experience. The results of the study can also help other researchers in the field of e-learning for older adults and in the context of interactive eHealth portals to gain insights into improving the usability of such systems.

If the ePHR is to be used by larger numbers of the German population in the future, the need for programs to empower users of the ePHR will increase. The use of the ePHR in Germany is not yet widely established, as many structural prerequisites (such as the use of the ePHR by healthcare providers) are lacking. As a result, there is currently limited opportunity to use one’s ePHR in a real-life scenario and to gain experience and confidence in its use. Therefore, the inclusion of interactive ePHR mockups as a central element of the learning platform, which will be evaluated in more depth in further studies, may be of even greater relevance.

Overall, the ePA Coach platform can be an important building block in the implementation of the ePHR in Germany. A prerequisite for the use of the learning platform is good usability, which may be achieved by implementing the results of this study. The empowered use of the ePHR by older adults could be supported by the ePA Coach, which could lead to better care and greater participation of the target group in the healthcare system in the long term, as well as to cost savings and more effective healthcare. However, to determine the actual effect and benefit of using the ePA Coach learning platform for older adults, a future study is needed to examine not only the usability but also the learning success, in order to gauge the improvement of older adults in terms of knowledge and competence.