Abstract
The claim that 100 ms system latency is fast enough for an optimal interaction with highly interactive computer systems has been challenged by several studies demonstrating that users are able to perceive latencies well below the 100 ms mark. Although a large share of daily computer interactions still relies on the mouse, to date only a few studies on latency perception thresholds have employed a corresponding interaction paradigm. Therefore, we determined latency perception thresholds in a mouse-based computer interaction task. Because previous studies indicate considerable inter-individual variance in latency perception thresholds, we also tested whether user characteristics, such as experience with latency in computer interaction and interaction styles, might be related to these inter-individual differences. Our results show that latency perception thresholds for a simple mouse-based computer interaction lie in the range of 60 ms and that inter-individual differences in latency perception can be related to user characteristics.
Keywords
- Latency
- System response time
- Human-computer interaction
- Mouse-based interaction
- Latency perception
1 Introduction
Optimizing system latency (the time delay between user input and the output response of a computer system [13, 16]) is a common challenge in any interactive computer system. System latency can degrade user experience [2, 18] and lead to a less efficient interaction with a given system [10, 11, 13, 17]. Consequently, specifying design goals in terms of acceptable latencies has been a fundamental research topic in human-computer interaction for several decades [4, 13, 14]. Ideally, an interactive computer system should mimic physical systems as closely as possible to allow for a fluid and natural interaction. Hence, an optimal computer system should have no subjectively perceptible time delay between system input (e.g., hand movements) on the one side and respective system output (e.g., cursor movement) on the other side. Determining this latency perception threshold (also termed just noticeable difference, JND [16]) means finding the system latency at which users can no longer distinguish between a system with and without additional latency.
In the past, common latency design guidelines typically proposed the design goal of 100 ms system latency for an optimal interaction with highly interactive computer systems [5, 14]. However, recent research [6, 16] has shown that users can perceive much lower latencies (around 10 ms) when interacting with a touchscreen device (i.e., direct-touch interaction). Studies applying indirect input paradigms suggest that users can detect latencies as low as 50 ms when using a touchpad or a stylus [3, 6]. For mouse-based interaction, initial research indicates that latencies below 100 ms can impair performance [10, 17]. While a large share of daily computer interactions still relies on the mouse, only a few studies on latency perception thresholds have employed a corresponding interaction paradigm. Therefore, it appears relevant to validate these recent results with this prevalent interaction type.
Furthermore, previous studies have provided evidence for considerable inter-individual differences in latency perception thresholds. Annett and colleagues [3] found latency perception thresholds in the range of 30–80 ms and 60–105 ms for stylus-based tasks (drawing and writing). An even greater range of 20–100 ms was reported by Jota, Ng, Dietz, and Wigdor [11] for a tapping task in which participants had to evaluate the time delay between tapping a touch display and the appearance of a rectangle. Explaining this variability of latency perception thresholds found across users and tasks should be informative for latency perception studies as well as latency design guidelines, because the average latency perception threshold of a population is of relatively little value when the population variance is high. One way of explaining this variance might be to relate it to inter-individual differences in user characteristics. Two classes of variables of particular interest here are previous experience with latency in computer interaction [20] and inter-individual differences in interaction styles [19].
The objective of the present research was twofold – first, we aimed to determine the magnitude of latency perception thresholds in a mouse-based human-computer interaction task. Second, we intended to explore factors which may lead to inter-individual differences in these identified latency perception thresholds.
2 Background
2.1 Perception of System Latency
Echoing Annett and colleagues [3]: “From a psychological and interaction perspective, it is imperative to understand the processes governing latency perception before recommendations for future systems are made” (p. 173). One way of shedding light on the underlying processes of latency perception is the comparison of latency perception thresholds across different tasks. In experimental tasks where participants solely relied on the temporal offset between an input and an output (e.g., tapping on a touch screen and waiting for a change of the display), latency perception thresholds have been found to be considerably higher compared to tasks where latencies lead to changes on more salient dimensions, such as the spatial offset between finger position and cursor position in direct-touch interaction [16]. Abstracting from this, perceiving relatively low system latencies seems to require the comparison of the input state and the output state of a computer system along dimensions influenced by system latency that are readily perceivable (e.g., spatial information in contrast to temporal information).
The spatial offset between the input device and the cursor that system latency causes in direct-touch pointing or dragging tasks can be processed directly in the visual system [15, 16]. Compared to this, the perception of latency in mouse-based pointing or dragging tasks should require a more complex comparison, because information about the input and the output is distributed across different perceptual modalities (the somatosensory and the visual system, respectively) and has to be transformed before being compared. More distributed representations of the input state and the output state should therefore hinder their comparison, effectively leading to higher latency perception thresholds. Indeed, Annett and colleagues [3] found that making visual information about the input state (but not the output state) of a system unavailable increased latency perception thresholds in an inking task.
While the available information about the input and the output state of a system fundamentally shape how latency is perceived, there are other factors which should also determine if a user will perceive latency when performing a given task. Following Annett and colleagues’ [3] latency perception model, two main factors influencing latency perception are contextual demands (e.g., task requirements, environmental factors) and the observer of the system. Observer characteristics relevant for latency perception may include experience of how latency manifests in computer interactions, as well as practice with tasks highly sensitive to latency. If the observer of the system also provides the input of the system, characteristics of the interaction, such as movement speed of the input device controlled by the user, should be of importance for latency perception, as they may lead to a higher saliency of the system latency. Hence, differences in movement speed and experience with tasks where latency detection is relevant may be used to explain inter-individual variance of latency perception thresholds.
2.2 Movement Speed
Higher movement speeds in dragging or pointing tasks cause a larger spatial offset between the input device and the cursor when latency is present, because the input device travels a farther distance before the cursor position is updated and this change is displayed [15, 16]. In direct-touch dragging tasks, the spatial offset between the finger and the center of the dragged object is still perceivable at latencies as low as 10 ms when moving at a moderate pace [15]. This means that even at very low latencies an increase in movement speed should increase the spatial offset between the hand and the cursor and therefore make it more likely that users will perceive latency. This relationship between movement speed and latency perception should be the same for mouse-based interaction, albeit less pronounced, because of the more complex comparison of mouse position (i.e., input state) and cursor position (i.e., output state). While direct-touch dragging allows a continuous comparison of finger position and cursor position, evaluating the simultaneity of mouse position and cursor position may only be possible when initiating or changing the direction of a movement (i.e., once the cursor has started to follow a movement vector of the hand, it is difficult to estimate the displacement distance to the position of the mouse).
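To illustrate the underlying geometry with hypothetical numbers: at a constant hand speed of 0.5 m/s, a latency of 10 ms corresponds to a spatial offset of roughly 0.5 m/s × 0.01 s = 5 mm between input device and cursor, whereas 100 ms yields about 50 mm; doubling the movement speed doubles the offset at any given latency.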
2.3 Experience with Latency in Computer Interaction
Apart from comparing input state and output state of a system, users might also draw from their experience about how latency manifests in the computer interaction to compare the actual output state of the system with an expected output state (i.e., if there was no perceptible system latency). A common activity where users might gather experience about the impact of system latency and practice latency detection are highly dynamic computer games, such as action games, racing games, or first person shooter games. Because these games require very fast and precise reactions depending on the output state of the system, they are especially susceptible to system latency, so even small system latencies can impair performance. This is also a reason why these games are a relevant research subject for studying the influence of latency on user performance [10].
2.4 Research Questions and Hypotheses
The purpose of the present study was to answer two questions – first, we aimed to determine the magnitude of latency perception thresholds in a mouse-based human-computer interaction task.
Q1: Of which magnitude are the latency perception thresholds in a mouse-based human-computer interaction task?
Our second research question concerning the inter-individual differences of latency perception thresholds was:
Q2: Are there user characteristics related to inter-individual differences in latency perception thresholds?
With respect to the second research question, we hypothesized that higher movement speeds should lead to a higher discrepancy between expected and actual mouse cursor position and therefore to a higher perceptibility of latency, resulting in lower latency perception thresholds for individuals with higher average mouse movement speeds.
H1: Higher mouse movement speeds are related to lower latency perception thresholds.
We also expected individuals with a higher exposure to highly dynamic computer games, such as action games, racing games, or first person shooter games, to exhibit lower latency perception thresholds, as they might have a higher sensitivity towards a mismatch of actual and predicted mouse cursor position.
H2: More experience with highly dynamic computer games is related to lower latency perception thresholds.
3 Method
3.1 Participants
Twenty students (10 female, age 19–36 years, M = 23.45, SD = 3.32) who were recruited via the local psychology student mailing list took part in the experiment. All participants had normal or corrected-to-normal vision and normally used their right hand for handling computer mice. Participants signed an informed consent sheet at the beginning of the experiment and received partial course credit for participation.
3.2 Experimental Setup
Procedure.
Participants were asked to complete a mouse-based dragging task on a computer screen. The task was to move a grey square representing the mouse cursor from the left side of the display to the right and back again without pausing. The left and right target areas were each indicated by two short dashes at the bottom and the top of the display (see Fig. 1). Both target areas covered 20% of the display. Moreover, participants were allowed to touch the left and right edges of the screen. Therefore, no precise dragging was necessary to perform the task, which means that the task difficulty (i.e., the index of difficulty according to Fitts' law [8]) was low and only a relatively low level of coordination was required (i.e., low workload for action control).
To determine participants’ latency perception thresholds we used an adaptive threshold estimation approach (ZEST – for more details regarding this method see section “Scales and Measures”) in a two-alternative forced-choice discrimination task (2AFC task). Hence, each trial consisted of two subtrials showing (a) the reference system with baseline latency and (b) the probe system with additional latency ranging between 1 and 300 ms. The systems were presented in a randomized order. After finishing both subtrials, participants were asked to indicate in which subtrial the system reacted instantaneously, that is without additional latency. The latency perception threshold was defined as the additional latency of the probe system where participants were able to correctly distinguish between the reference system and the probe system 75% of the time. This is a commonly accepted perception threshold value for 2AFC tasks lying between the maximum hit rate of 100% and the baseline hit rate of 50% (the guessing probability when performing at chance level) [16, 21].
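For a 2AFC task, this criterion is the point halfway between the guessing rate γ and perfect performance, that is, p* = γ + (1 − γ)/2 = .50 + .25 = .75.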
To ensure that participants were familiar with the task and understood how added latency becomes apparent, they completed a short training with a fixed and relatively high probe latency of 200 ms. The training ended when five trials in a row were answered correctly. It took participants five to eight trials to finish the training. The experimental trials were divided into two separate ZEST runs, providing two perception threshold estimates for each participant. Each ZEST run consisted of 30 trials, except for the first three participants, for whom the number of ZEST trials was still set to 20. While this is a small inconsistency in our methodology that should be considered when interpreting our results, the effect on the results should be small and should at most lead to a somewhat lower precision of our estimates (i.e., rather than biasing the results in a certain direction). It also has to be noted that there is no linear relationship between the number of ZEST trials and the reliability of the threshold estimate (i.e., the effect of increasing the number of ZEST trials from 20 to 30 is relatively small).
Hardware and Software.
The hardware setup consisted of a computer with a 3.3 GHz processor (Intel Core i7-5820K), a mouse with a polling rate of 1000 Hz and a sensor resolution set to 800 dpi (Logitech G303), and a 24-inch monitor with a refresh rate of 144 Hz (Acer XF240H). The experimental task was implemented in C++ using the open-source libraries of SFML (www.sfml-dev.org) for 2D visual displays. System latency was manipulated by delaying the update of the mouse cursor position for a fixed amount of time throughout the probe subtrials.
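The study's task was implemented in C++ with SFML; as a language-agnostic illustration of the delay mechanism described above, the following Python sketch buffers timestamped mouse samples and only releases them to the rendered cursor once they are at least the added latency old (class and method names are ours and purely illustrative, not the study's code):

```python
from collections import deque

class DelayedCursor:
    """Delay cursor updates by a fixed amount of time by buffering
    timestamped mouse samples and releasing them once they are old enough."""

    def __init__(self, added_latency_ms):
        self.delay = added_latency_ms   # 0 for the reference, 1-300 ms for the probe
        self.buffer = deque()           # (timestamp_ms, x, y) samples
        self.cursor = (0, 0)            # position actually drawn on screen

    def on_mouse_sample(self, t_ms, x, y):
        # Called at the polling rate of the mouse (1000 Hz in the study).
        self.buffer.append((t_ms, x, y))

    def position_at(self, now_ms):
        # Release every sample that is at least `delay` ms old; the newest
        # released sample becomes the cursor position for the current frame.
        while self.buffer and now_ms - self.buffer[0][0] >= self.delay:
            _, x, y = self.buffer.popleft()
            self.cursor = (x, y)
        return self.cursor
```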
System Latency.
To measure the latency of the computer system, we recorded the mouse and the monitor with a 1000 Hz high-speed camera (Sony RX100 IV) while running the program of the experimental task and initiating a sharp movement of the mouse with a solid object [10]. The latency of the system was then calculated from the number of frames elapsed between the start of the mouse movement and the first update of the mouse cursor position. To assess the accuracy of the latency manipulation, we measured the baseline system latency and the system latency with additional latency ten times each. The mean baseline latency of the system amounted to 8 ms (SD = 1.7 ms). Adding 200 ms latency through the software resulted in a mean system latency of 207 ms (SD = 1.5 ms), differing only slightly from the expected system latency of 208 ms when considering the baseline of 8 ms.
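To illustrate the conversion behind this measurement (not the authors' analysis code): at 1000 frames per second each video frame corresponds to 1 ms, so the end-to-end latency of one repetition is simply the number of frames counted between movement onset and the first cursor update.

```python
import statistics

MS_PER_FRAME = 1000 / 1000  # 1000 fps high-speed camera -> 1 ms per frame

def latency_from_frame_counts(frame_counts):
    """Convert counted video frames (movement onset to first cursor update)
    into milliseconds and summarize the repeated measurements."""
    latencies_ms = [n * MS_PER_FRAME for n in frame_counts]
    return statistics.mean(latencies_ms), statistics.stdev(latencies_ms)

# Usage with hypothetical frame counts (not the measured data):
# latency_from_frame_counts([7, 8, 10, 8, 6, 9, 8, 7, 9, 8])
```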
3.3 Scales and Measures
Estimation of Latency Perception Thresholds.
The Bayesian threshold estimation approach ZEST [12] makes it possible to obtain precise threshold estimates within as few as 20 trials [1]. Starting from a hypothetical prior distribution of perception thresholds, this procedure estimates the perception threshold (and therefore the optimal latency of the probe to test for this threshold) based on the participant's performance in each preceding trial. This means that a correct (incorrect) answer at a given latency level leads to a decrease (increase) of the participant's estimated perception threshold, which in turn is used as the latency level of the probe for the next trial. The update of the prior distribution is done via multiplication with a likelihood function, which gives the probability of a correct (or incorrect) answer depending on the participant's current estimated perception threshold and the latency of the probe [1].
We used the Weibull function as our likelihood function, which also depends on fixed parameters for the guessing rate (γ), the lapsing rate (λ), the slope factor (β), and the parameter ε. Because we used a 2AFC task, γ was set to .50 (the hit rate when performing at the level of chance), λ was set to .03, β was set to 3.5 as suggested in [22], and ε was set to −0.03 (following Eq. 4 from [1]). The prior distribution was a normal distribution on a log scale centered on 2 with a standard deviation of 0.7, resulting in a prior distribution mean of 100 ms (10^2) and the inclusion of thresholds from 20 ms (≈10^1.3) up to 500 ms (≈10^2.7) within ±1 SD of the mean. The upper bound for the probe latency in each trial was set to 300 ms, because previous studies indicate that users should readily perceive system latencies of this magnitude [3, 6] and higher system latencies are not representative of normal computer systems [10].
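To make the update rule concrete, the following Python sketch implements a ZEST-style loop with the parameter values above, using the Weibull parameterization common to QUEST/ZEST [1, 12]. It is an illustrative reimplementation rather than the study's C++ code; the grid resolution and the posterior-mean placement rule are our assumptions.

```python
import numpy as np

GUESS, LAPSE, BETA, EPSILON = 0.50, 0.03, 3.5, -0.03   # values from above

# Prior over log10(threshold in ms): normal, mean 2 (100 ms), SD 0.7.
log_grid = np.linspace(0.0, 3.0, 601)                   # 1 ms ... 1000 ms
prior = np.exp(-0.5 * ((log_grid - 2.0) / 0.7) ** 2)
prior /= prior.sum()

def p_correct(log_probe, log_threshold):
    """Weibull psychometric function; evaluates to roughly .75 when the
    probe latency equals the threshold, matching the 2AFC criterion."""
    x = BETA * (log_probe - log_threshold + EPSILON)
    return 1.0 - LAPSE - (1.0 - LAPSE - GUESS) * np.exp(-(10.0 ** x))

def zest_update(posterior, probe_ms, correct):
    """Multiply the posterior by the likelihood of the observed response."""
    likelihood = p_correct(np.log10(probe_ms), log_grid)
    posterior = posterior * (likelihood if correct else 1.0 - likelihood)
    return posterior / posterior.sum()

def next_probe_ms(posterior, max_ms=300.0):
    """ZEST places the next probe at the posterior mean (capped at 300 ms)."""
    return min(10.0 ** float(np.dot(posterior, log_grid)), max_ms)
```

After the final trial of a run, the posterior mean (the value returned by next_probe_ms without the cap) serves as the threshold estimate.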
Movement Speed.
We measured participants' movement speed in two ways. The first measure was the average movement speed across all probe subtrials; the second measure was the average covered distance after the first 100 ms of movement for each probe subtrial, focusing on the first acceleration of each subtrial. Both measures were computed from a log file of the cursor position on a millisecond time scale. We included the second measure to focus on the first displacement of the mouse cursor, where we assumed additional latency to be most visible, as well as to control for the possibility that some participants might move slower in certain sections, which would in turn systematically decrease their average movement speed.
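A minimal sketch of how the two measures could be derived from such a log (the array-based log format and variable names are our assumptions, and the second measure is read here as the distance covered within the first 100 ms of a subtrial):

```python
import numpy as np

def movement_measures(timestamps_ms, xs, ys):
    """Compute average movement speed (px/ms) over a subtrial and the
    distance (px) covered within the first 100 ms of that subtrial."""
    pos = np.column_stack([xs, ys]).astype(float)
    step = np.linalg.norm(np.diff(pos, axis=0), axis=1)    # px moved per sample
    t = np.asarray(timestamps_ms, dtype=float)
    avg_speed = step.sum() / (t[-1] - t[0])                # total distance / total time
    early = t[1:] <= t[0] + 100.0                          # samples within first 100 ms
    dist_100 = step[early].sum()
    return avg_speed, dist_100
```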
Experience with Highly Dynamic Computer Games.
To assess the amount of participants' practice with highly dynamic computer games, we asked how much time per week they typically spent playing highly dynamic computer games, such as action games, racing games, or first person shooter games. We asked this question with regard to the past six months (six-month experience), as well as to the period in the participants' lives when they were most actively playing these games (lifetime experience).
4 Results
4.1 Magnitude of Latency Perception Thresholds
With regard to our first research question (Q1) we computed measures of central tendency and dispersion of participants’ latency perception thresholds. Because the retest reliability of the two latency perception threshold estimates of each participant was high (r = .87), their mean was used for subsequent analyses. The latency perception thresholds’ range was 34–137 ms with a mean of 65 ms (Median = 54 ms) and a standard deviation of 30 ms. The latency perception threshold distribution was right-skewed and a Shapiro-Wilk test indicated that the latency perception thresholds were non-normally distributed (W(20) = 0.88, p = .021).
4.2 Inter-individual Differences Related to Latency Perception Thresholds
To address our second research question (Q2), we aimed to quantify the strength of the associations between latency perception thresholds and average movement speed, average covered distance after 100 ms, as well as six-month and lifetime experience with highly dynamic computer games. We decided not to analyze the six-month experience with highly dynamic computer games, because 60% of the participants had not played any of these games in the past six months, causing a substantial restriction of variance for this variable. The percentage of participants without any lifetime experience with highly dynamic computer games was only 30%. The average time spent playing highly dynamic computer games within a week in the period of the participants' lives when they were most actively playing these games was 10.25 h (Median = 4, SD = 12.73). Because of the non-normality of the latency perception threshold distribution, we used non-parametric Spearman correlations.
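A minimal sketch of this analysis step with SciPy (variable names are hypothetical; one array entry per participant):

```python
from scipy.stats import spearmanr

def threshold_correlation(thresholds_ms, predictor):
    """Rank-based (Spearman) correlation between latency perception
    thresholds and a user characteristic, robust to the right-skewed
    threshold distribution reported above."""
    rho, p_value = spearmanr(thresholds_ms, predictor)
    return rho, p_value
```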
To test our first hypothesis (H1), we computed the correlations between latency perception thresholds and average movement speeds, as well as average covered distance after 100 ms. Both higher average movement speeds and higher average covered distance after 100 ms were associated with lower latency perception thresholds (r_S = −.23 and r_S = −.35, respectively), but the correlations did not reach statistical significance (p > .05).
As proposed in our second hypothesis (H2), a higher lifetime experience with highly dynamic computer games was significantly correlated with lower latency perception thresholds (r_S = −.42, p = .031). To further illustrate the effect of lifetime experience with highly dynamic computer games, we split the sample into two groups along the median of this variable. The mean latency perception thresholds were M = 52 ms (SD = 6 ms) for the group with high lifetime experience and M = 78 ms (SD = 11 ms) for the group with low lifetime experience.
5 Discussion
Our results indicate that on average users are able to perceive latencies around 60 ms (combining the median latency perception threshold and our baseline system latency) in mouse-based interaction tasks. We also found considerable inter-individual differences in latency perception thresholds in our sample with some participants reaching or even surpassing the 100 ms mark. These results come close to values reported in recent studies where participants performed other forms of indirect input tasks [3, 6, 11]. Furthermore, we found that observer characteristics can play a role in latency perception, as the inter-individual differences in latency perception thresholds were related to the experience with highly dynamic computer games, such as action games, racing games, or first person shooter games. The average latency perception threshold of the group with little prior experience with highly dynamic computer games was about 25 ms higher than the thresholds of the high-experience group.
There was also a tendency for participants with higher movement speeds to have lower latency perception thresholds, but the effect was too small to reach statistical significance. While others have argued that an increased spatial offset between input device and cursor caused by higher movement speeds does not increase the saliency of latency [3], we would not rule out this possibility given our results and considering our small sample size, which restricted the power to detect such a relatively small effect. Although the difference between our two measures of average movement speed is small, we found a greater effect of movement speed on latency perception thresholds for the measure of covered distance after 100 ms, which only included information about the first left-to-right movement in our experiment. We would argue that there might be types of movements or specific movement phases (e.g., movement onset, direction changes) where movement speed could be relevant for latency perception, but studies with higher power would be needed to detect this effect.
5.1 Practical and Theoretical Implications
The most important practical implication of the present study might be that latency perception thresholds can be associated with user characteristics. This means that when measuring latency perception thresholds or designing a computer system based on the results of these measurements, one should also consider the user characteristics of the sample the measurements were conducted with, the user characteristics of the target audience, as well as the fit of these two groups with respect to their relevant user characteristics.
While our results support the notion that user characteristics play an important role in latency perception in principle, our experiment does not allow us to pinpoint the mechanisms that caused this effect. We theorized that experience with highly dynamic computer games should make it more likely for users to perceive lower latencies, as these games implicitly provide the user with some information about how latencies manifest in computer interaction. Other mechanisms possibly mediating the relationship between experience with highly dynamic computer games and latency perception thresholds might be the participants' motivation to perform well in a computer interaction task, aspects of the interaction style apart from pure movement speed, or cognitive abilities, such as spatial perception.
Another confounding variable could be the latency of the computer systems participants interact with on a day-to-day basis. As highly dynamic computer games have higher hardware requirements compared to typical office software or web browsers, users who play (or played) these games might also own a faster computer system to match these hardware requirements. It is possible that participants of our study referred to their usual latency experience as a baseline when doing the 2AFC task. It also has to be noted that our study only provides correlational data, which does not necessarily imply a causal relationship between experience with highly dynamic computer games and latency perception thresholds.
5.2 Future Research
Firmly establishing the connection between latency perception and user characteristics cannot be achieved with a single study, especially when taking into account the relatively small sample size of the present one. Therefore, subsequent studies are needed to test this relationship and possible mediators, such as those mentioned above. Apart from probing the relationship with experience with highly dynamic computer games, subsequent studies might also focus on gathering more data about interaction styles, such as movement speed or the role of strategies for latency compensation, which can also involve the adaptation of movement speeds [7, 9]. Yet another avenue for future research might be the influence of users' normal latency experience when interacting with their own computer systems. As was pointed out before [16, 20], users' expectations about how much latency a system should exhibit are highly influenced by the technical standard they gather their experience with, which is continuously improving. These expectations could also be related to users' latency perception or affect it indirectly through altered interaction styles.
6 Conclusion
The results of our study provide some new insights for the understanding of latency perception. First, we found that latency perception thresholds for a simple mouse-based task lie well below 100 ms, similar to the results of studies about latency perception thresholds for other indirect input tasks [3, 6, 11]. However, this does not necessarily mean that users will perceive latencies below 100 ms or find them disturbing [3]. In the light of the available information about latency perception thresholds, the recommendation of 100 ms system latency [5, 14] should nevertheless be scrutinized in further studies [4]. Second, we demonstrated that inter-individual differences in latency perception are related to user characteristics, in this case the experience with highly dynamic computer games. We also argued that interaction characteristics such as movement speed might play a role in latency perception, although further studies are needed to investigate this effect. Together, these findings open up many possibilities for future research about latency perception, particularly regarding the role of user characteristics and interaction styles.
References
Alcala-Quintana, R., Garcia-Perez, M.A.: The role of parametric assumptions in adaptive Bayesian estimation. Psychol. Methods 9, 250–271 (2004). doi:10.1037/1082-989X.9.2.250
Anderson, G., Doherty, R., Ganapathy, S.: User perception of touch screen latency. In: Marcus, A. (ed.) DUXU 2011. LNCS, vol. 6769, pp. 195–202. Springer, Heidelberg (2011). doi:10.1007/978-3-642-21675-6_23
Annett, M., Ng, A., Dietz, P.H., Bischof, W.F., Gupta, A.: How low should we go? Understanding the perception of latency while inking. In: Proceedings of Graphics Interface 2014, pp. 167–174. Canadian Information Processing Society (2014)
Attig, C., Rauh, N., Franke, T., Krems, J.F.: System latency guidelines then and now – is zero latency really considered necessary? In: Paper Presented at HCI International 2017 (2017)
Card, S.K., Robertson, G.G., Mackinlay, J.D.: The information visualizer, an information workspace. In: CHI 1991 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 181–186. ACM (1991). doi:10.1145/108844.108874
Deber, J., Jota, R., Forlines, C., Wigdor, D.: How much faster is fast enough? User perception of latency & latency improvements in direct and indirect touch. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 1827–1836. ACM (2015). doi:10.1145/2702123.2702300
de la Malla, C., Lopez-Moliner, J., Brenner, E.: Dealing with delays does not transfer across sensorimotor tasks. J. Vis. 14, 8 (2014). doi:10.1167/14.12.8
Fitts, P.M.: The information capacity of the human motor system in controlling the amplitude of movement. J. Exp. Psychol. 47, 381–391 (1954). doi:10.1037/h0055392
Honda, T., Hirashima, M., Nozaki, D.: Adaptation to visual feedback delay influences visuomotor learning. PLoS ONE 7, e37900 (2012). doi:10.1371/journal.pone.0037900
Ivkovic, Z., Stavness, I., Gutwin, C., Sutcliffe, S.: Quantifying and mitigating the negative effects of local latencies on aiming in 3d shooter games. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 135–144. ACM (2015). doi:10.1145/2702123.2702432
Jota, R., Ng, A., Dietz, P.H., Wigdor, D.: How fast is fast enough? A study of the effects of latency in direct-touch pointing tasks. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2291–2300. ACM (2013). doi:10.1145/2470654.2481317
King-Smith, P.E., Grigsby, S.S., Vingrys, A.J., Benes, S.C., Supowit, A.: Efficient and unbiased modifications of the QUEST threshold method: theory, simulations, experimental evaluation and practical implementation. Vis. Res. 34, 885–912 (1994)
Mackenzie, I.S., Ware, C.: Lag as a determinant of human performance in interactive systems. In: Proceedings of the INTERACT 1993 and CHI 1993 Conference on Human Factors in Computing Systems. pp. 488–493. ACM (1993). doi:10.1145/169059.169431
Miller, R.B.: Response time in man-computer conversational transactions. In: Proceedings of the December 9–11, 1968, Fall Joint Computer Conference, Part I, pp. 267–277. ACM (1968). doi:10.1145/1476589.1476628
Ng, A., Dietz, P.H.: The effects of latency and motion blur on touch screen user experience. J. Soc. Inform. Display 22, 449–456 (2014). doi:10.1002/jsid.243
Ng, A., Lepinski, J., Wigdor, D., Sanders, S., Dietz, P.H.: Designing for low-latency direct-touch input. In: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, pp. 453–464. ACM (2012). doi:10.1145/2380116.2380174
Pavlovych, A., Gutwin, C.: Assessing target acquisition and tracking performance for complex moving targets in the presence of latency and jitter. In: Proceedings of Graphics Interface 2012, pp. 109–116. Canadian Information Processing Society (2012)
Potter, J.J., Singhose, W.E.: Effects of input shaping on manual control of flexible and time-delayed systems. Hum. Factors 56, 1284–1295 (2014). doi:10.1177/0018720814528004
Seow, S.C.: Designing and Engineering Time: the Psychology of Time Perception in Software. Pearson Education, Boston (2008)
Shneiderman, B., Plaisant, C.: Designing the User Interface: Strategies for Effective Human-Computer Interaction. Pearson, Boston (1987)
Ulrich, R., Vorberg, D.: Estimating the difference limen in 2AFC tasks: pitfalls and improved estimators. Atten. Percept. Psychophys. 71, 1219–1227 (2009). doi:10.3758/APP.71.6.1219
Watson, A.B., Pelli, D.G.: QUEST: a Bayesian adaptive psychometric method. Atten. Percept. Psychophys. 33, 113–120 (1983). doi:10.3758/BF03202828
Acknowledgements
This research was funded by the German Federal Ministry of Education and Research (03ZZ0504H) in the context of the project fast-realtime. Statements in this paper reflect the authors’ views and do not necessarily reflect those of the funding body or of the project partners.