Method
The second study’s goal was to validate the Samply Research mobile application using the participation data from an external survey. The study was conducted in June and July 2020 in two student projects carried out during the experimental practicum at the University of Konstanz, Germany.
Because the study’s main goal was validation of the mobile app, we will omit a description of the underlying results obtained in the two student projects and instead focus on their participants’ response and dropout rates. The first project studied the effects of gratitude on well-being, whereas the second project examined the role of time management techniques in increasing the efficiency of studying. Participants were recruited via the University of Konstanz’s participant management system, SonaSystems, and through personal contacts with students.
Materials
The Samply Research mobile application
The Samply Research mobile app was available as a free download from Google Play or the App Store. Participants in both projects completed the first browser-based survey online, receiving instructions on how to download and use the app. To match participants’ IDs in the Gratitude project, a random four-digit participant number was generated and displayed in the first survey, together with instructions for entering it in the mobile app. In the Time Management project, participants were asked to create their own personal identification number (e.g., based on their initials) and enter it in both the online survey and the mobile app.
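The random four-digit participant number described above can be generated in a few lines; the sketch below is illustrative only, not the actual implementation (the function name and zero-padding are assumptions):

```python
import random

def make_participant_number(n_digits: int = 4) -> str:
    """Return a random numeric participant ID, zero-padded to n_digits."""
    return f"{random.randrange(10 ** n_digits):0{n_digits}d}"

# The same string would be shown in the first web survey
# and typed into the mobile app by the participant.
participant_number = make_participant_number()
```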
After the participants had installed the application and joined the projects, the scheduled notifications began. When participants tapped a notification on their smartphones, they were redirected to the first page of the mobile survey. The web link contained the participant’s Samply ID, so that each ID was recorded inside the survey and could later be matched with Samply’s data.
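Embedding the Samply ID in the survey link can be sketched as follows; the base URL and the query-parameter name `id` are hypothetical, since the source does not specify them:

```python
from urllib.parse import urlencode

def survey_link(base_url: str, samply_id: str) -> str:
    """Append the participant's Samply ID as a query parameter so that
    the survey records it and the response can later be matched."""
    return f"{base_url}?{urlencode({'id': samply_id})}"

# Hypothetical URL and ID for illustration.
link = survey_link("https://open-lab.online/survey", "4821")
```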
Web surveys and notification schedules
The projects’ web surveys were constructed in the lab.js experiment builder (https://lab.js.org, Henninger et al., 2020) and hosted on the Open Lab platform (https://open-lab.online).
The Gratitude project’s schedule sent notifications once a day for 14 consecutive days at a random time between 6 p.m. and 8 p.m. The notification contained a link to the project’s web survey that asked participants to list things they felt grateful for that day in one condition and things that had happened to them during the day in the control condition. Each daily survey took several minutes to complete.
The Time Management project’s schedule sent notifications once a day for 13 consecutive days at a random time between 6 p.m. and 8 p.m. The daily survey asked two questions, one on well-being and one rating personal studying progress.
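Both schedules follow the same pattern: one notification per day at a random time between 6 p.m. and 8 p.m. A minimal sketch of generating such a schedule, assuming uniform sampling within the window (the actual scheduling is handled by Samply):

```python
import random
from datetime import datetime, timedelta

def evening_schedule(start_date: datetime, n_days: int,
                     window_start_h: int = 18, window_end_h: int = 20) -> list:
    """One random notification time per day, uniform within the evening window."""
    window_s = (window_end_h - window_start_h) * 3600
    times = []
    for day in range(n_days):
        base = (start_date + timedelta(days=day)).replace(
            hour=window_start_h, minute=0, second=0, microsecond=0)
        times.append(base + timedelta(seconds=random.uniform(0, window_s)))
    return times

# 14 days, as in the Gratitude project (13 for Time Management).
schedule = evening_schedule(datetime(2020, 6, 1), 14)
```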
The data and materials are available at OSF (https://osf.io/u57aw/), and both student projects were preregistered.
Results
The Samply Research mobile app was installed by 76 participants, 44 of whom participated only in the Gratitude project, 26 only in the Time Management project, and 6 in both. Thus, the Gratitude project collected data from 50 participants, and the Time Management project gathered data from 32. Eight participants failed to complete their daily surveys (NGr = 3 and NTM = 5), so the final samples for the analysis comprised 74 participants (NGr = 47 and NTM = 27) (see Table 3 for the participation rates).
Table 3 Student projects’ participation rates

Compliance rate
The overall compliance rate, defined as the percentage of notifications that were followed by participation in a survey, was 65%, i.e., 657 out of 1011 notifications were followed (Med = 76.9, SD = 28.4).
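As a quick arithmetic check of the rate reported above:

```python
def compliance_rate(followed: int, sent: int) -> float:
    """Percentage of sent notifications that were followed by a survey response."""
    return 100 * followed / sent

# Values from the text: 657 of 1,011 notifications followed, i.e., about 65%.
rate = compliance_rate(657, 1011)
```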
To analyze the effects of project and participation day on the compliance rate, we fitted a multiple linear regression model with the participation rate (0–100%) as the dependent variable and project, participation day, and their interaction as predictors (see Fig. 6 for the compliance rate). The model explained a significant proportion of variance in the compliance rate, R2 = .58, R2adjusted = .52, F(3, 23) = 10.37, p < .001. The main effect of project was significant, indicating that the compliance rate in the Gratitude project, M = 69.8, Med = 71.3, SD = 10.7, was significantly higher than in the Time Management project, M = 56.1, Med = 57.1, SD = 9.4; b = −0.22, t(23) = −3.71, p = .001 (unstandardized coefficients are reported). The main effect of participation day was also significant, showing that the compliance rate decreased by about 2% a day, b = −0.02, t(23) = −3.70, p = .01. The interaction between project and day was not significant, indicating that the decrease in participation over time did not differ significantly between the two projects, b = 0.01, t(23) = 1.71, p = .10.
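A model of this form can be fitted on data of the same shape with ordinary least squares. The sketch below uses simulated per-day compliance rates; all numbers in it are illustrative assumptions, not the study’s data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated per-day compliance (%) for two projects over 13 shared days.
day = np.tile(np.arange(1, 14), 2)
project = np.repeat([0, 1], 13)            # 0 = Gratitude, 1 = Time Management
y = 75 - 14 * project - 2 * day + 1 * project * day + rng.normal(0, 3, 26)

# Design matrix: intercept, project, day, project-by-day interaction.
X = np.column_stack([np.ones(26), project, day, project * day])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS coefficient estimates
```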
Response time
Besides the compliance rate, another informative measure of an app’s usage is response time, defined as the time between when a notification is sent and when the survey web page is opened. Overall, the median response time across both projects was 52 minutes 34 seconds (in hours, Med = 0.9, M = 2.9, SD = 5.0). Response times did not differ significantly between the Gratitude project, Med = 0.8, M = 2.7, SD = 4.6, and the Time Management project, Med = 1.1, M = 3.4, SD = 5.4, as confirmed by a two-sided Wilcoxon rank-sum test, W = 45346, p = .38. Response times also did not change significantly over the course of the projects, as indicated by the lack of a significant effect of day on response time in a linear regression model, b = −0.03, t(660) = −0.62, p = .54, model fit: R2 < .001, R2adjusted < .001, F(1, 660) = 0.38, p = .54 (see Fig. 7 for response times).
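A Wilcoxon rank-sum test of this kind can be computed with the normal approximation. The pure-Python sketch below uses midranks for ties but, as a simplification, omits the tie correction to the variance:

```python
from statistics import NormalDist

def rank_sum_test(x: list, y: list) -> tuple:
    """Two-sided Wilcoxon rank-sum test (normal approximation).
    Returns (rank sum W of sample x, two-sided p value)."""
    pooled = sorted(x + y)
    # Assign midranks so that tied values share the same rank.
    midrank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        midrank[pooled[i]] = (i + 1 + j) / 2
        i = j
    w = sum(midrank[v] for v in x)
    n1, n2 = len(x), len(y)
    mu = n1 * (n1 + n2 + 1) / 2
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5   # no tie correction
    z = (w - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return w, p
```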
Interaction with notifications
The mobile app offered two ways to interact with notifications: tapping the notification in the smartphone’s notification bar or opening it from inside the mobile application. We analyzed the proportions and response times of interaction events to understand differences in the use of these two options. Of the 657 notifications that resulted in a survey response, 33% (n = 219) were opened by tapping the notification and 62% (n = 406) were opened from the app interface. Interestingly, 2.5% (n = 16) of the notifications were both tapped and opened in the app interface, and another 2.5% (n = 16) had missing information about the user interaction. Analyzing the same data by participant, we found that 39 participants (53%) both tapped notifications and used the app to open the survey link, 11 (15%) only tapped, and 24 (32%) only used the app interface to open notifications.
The median time between when a notification was sent and when a participant tapped on it in the notification bar was 21 minutes 24 seconds (in hours, Med = 0.4, M = 1.9, SD = 4.1). The median time between when a notification was sent and when the survey link was opened in the mobile app was 1 hour 24 minutes 28 seconds (Med = 1.4, M = 6.8, SD = 25.6). The higher standard deviation in the latter case indicated the presence of outliers (in this case, participants who responded to the notification the next day).
Device effects
Based on the device information recorded when participants took part in the survey, 55% (n = 41) used Android smartphones and 45% (n = 34) used iOS devices. Comparing these groups’ percentage of notifications that were followed by participation in a survey (0–100%) using an independent two-sided t test, we found no significant difference between Android users (M = 66.3, SD = 29.9) and iOS users (M = 62.6, SD = 26.8), t(72.55) = 0.57, p = .57, d = −0.13, 95% CI [−0.59, 0.33]. There were also no significant differences in individual median response times between Android users, in hours Med = 0.9, M = 2.1, SD = 3.4, and iOS users, Med = 1.8, M = 3.5, SD = 4.9, t(56.84) = −1.46, p = .15, d = 0.35, 95% CI [−0.11, 0.81].
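The fractional degrees of freedom reported above (72.55, 56.84) indicate a Welch (unequal-variances) t test. It can be sketched in a few lines; the function name is an assumption, and only the statistic and Welch–Satterthwaite degrees of freedom are computed here, since the standard library provides no t distribution for p values:

```python
from statistics import mean, stdev

def welch_t(x: list, y: list) -> tuple:
    """Welch's two-sample t statistic and Welch-Satterthwaite df."""
    n1, n2 = len(x), len(y)
    v1, v2 = stdev(x) ** 2, stdev(y) ** 2
    se2 = v1 / n1 + v2 / n2                 # squared SE of the mean difference
    t = (mean(x) - mean(y)) / se2 ** 0.5
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df
```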
Discussion
Compliance rate
The second study’s goal was to validate the Samply Research mobile app’s feasibility with the participants of two student experience-sampling projects. We also used data from these external surveys to measure response times and compliance rates. The average compliance rate in our validation study (65%) was lower than the 75% rate reported in a meta-analysis of studies with more than six daily notifications (Wen, Schneider, Stone, & Spruijt-Metz, 2017) and was closer to the average compliance rate of 70% (SD = 23%) calculated in the meta-analysis by Van Berkel et al. (2017).
Interestingly, compliance rates differed between the two student project studies, which might have been due to their subject matter (one related to time management), the interactions between researchers and participants, or the sample characteristics (see meta-analysis in Vachon, Viechtbauer, Rintala, & Myin-Germeys, 2019). The Gratitude project, with its higher compliance rate, may have established a stronger connection between participants’ efforts and the intrinsic reward of self-reporting, a motivational factor acknowledged in previous research (Van Berkel et al., 2017). We also observed that compliance showed a sharp initial drop and then decreased at a rate of about 2% a day over the course of the study, which is in line with observations of comparatively large dropout rates in online studies (Frick, Bächtiger, & Reips, 2001; Galesic, 2006; Reips, 2002, 2007).
Response time
The distribution of response times was highly skewed, with a median of around 52 minutes and outliers who only responded to notifications the next day. This could be considered an exceptionally long time interval but can be explained in light of the student project framework. Both projects sent one notification per day, with instructions stating that participants should provide their evaluation of that day (therefore notifications were sent in the evening). The instructions did not emphasize that participants should respond as fast as possible, although this type of instruction might be important for studies using a different schedule (e.g., several times a day). Furthermore, we did not let notifications expire, so the surveys could be opened at any time after receiving the notification. We used the same web link for these recurring surveys, but other studies could use different links to better monitor the survey’s progress.
Response times did not differ significantly between the projects and did not change throughout their durations, suggesting that participants who stayed with the projects retained the same pattern of application usage.
Participants in both projects made use of the possibility to open notifications via the app interface. Displaying notifications in the app was a backup mechanism to ensure that participants who had accidentally dismissed a notification (e.g., by swiping it away instead of tapping it) could still find the survey link in the app. We did not explicitly instruct participants to use this feature, yet a larger share of notifications was opened this way (62%) than by tapping (33%). Response times differed markedly between the two options, with notifications being tapped after a median of 21 minutes but opened via the application after a median of 1 hour 24 minutes. This may indicate that when participants delayed their response, they preferred to retrieve the survey link from the app later. However, it could also be that they missed a notification and found it later in the app. The fact that most participants (68%) tapped a notification at least once suggests that this notification method worked for them technically. However, because participants were using their own devices, we could not prevent them from turning off notifications or switching to a battery-saving mode. Such concerns are generic to all mobile apps and should be discussed between researchers and participants at the beginning of a study.
Device effects
The participants in both student projects used their own Android or iOS smartphones. The absence of significant differences in compliance rates and response times between the two operating systems indicates that the Samply Research mobile app operates similarly on both platforms.