Abstract
Predicting the time at which the integral over a stochastic process reaches a target level is of interest in many applications. Often, such computations must be made at low cost, in real time. As an intuitive example that captures many features of this problem class, we choose progress bars, a ubiquitous element of computer user interfaces. These predictors are usually based on simple point estimators with no error modelling, which leads to fluctuating behaviour that confuses the user. It also fails to provide a predictive distribution (risk values), which is crucial in many other application areas. We construct and empirically evaluate a fast, constant-cost algorithm using a Gauss-Markov process model, which provides more information to the user.
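To make the idea concrete, the following is a minimal illustrative sketch (not the paper's constant-cost algorithm): observed progress increments are smoothed with an Ornstein-Uhlenbeck (Matérn-1/2) covariance, the Gauss-Markov model named in the abstract, and the posterior rate uncertainty is propagated into a rough ETA interval. All function names and hyperparameters (`sigma`, `ell`, `noise`) are assumptions chosen for illustration.

```python
import numpy as np

def ou_kernel(t1, t2, sigma=0.05, ell=5.0):
    # Ornstein-Uhlenbeck (Matern-1/2) covariance: k(t, t') = sigma^2 exp(-|t - t'| / ell)
    return sigma**2 * np.exp(-np.abs(t1[:, None] - t2[None, :]) / ell)

def eta_posterior(ts, progress, target, sigma=0.05, ell=5.0, noise=1e-3):
    """Crude ETA estimate: GP-smooth the observed rates under an OU kernel,
    take the posterior mean rate at the current time, and propagate its
    posterior standard deviation into an ETA interval by extrapolation."""
    rates = np.diff(progress) / np.diff(ts)   # empirical rates per increment
    tm = 0.5 * (ts[1:] + ts[:-1])             # midpoints of the increments
    mu0 = rates.mean()                        # empirical prior mean rate
    K = ou_kernel(tm, tm, sigma, ell) + noise * np.eye(len(tm))
    t_star = np.array([ts[-1]])
    k_star = ou_kernel(t_star, tm, sigma, ell)
    alpha = np.linalg.solve(K, rates - mu0)
    mean_rate = mu0 + (k_star @ alpha)[0]
    var_rate = sigma**2 - (k_star @ np.linalg.solve(K, k_star.T))[0, 0]
    std_rate = np.sqrt(max(var_rate, 1e-12))
    remaining = target - progress[-1]
    eta = remaining / mean_rate
    lo = remaining / (mean_rate + 2 * std_rate)            # optimistic bound
    hi = remaining / max(mean_rate - 2 * std_rate, 1e-9)   # pessimistic bound
    return eta, lo, hi

# Example: steady progress at rate 0.05 towards a target of 1.0;
# eta is near 10, with (lo, hi) a rough two-sigma interval around it.
ts = np.linspace(0.0, 10.0, 21)
progress = 0.05 * ts
eta, lo, hi = eta_posterior(ts, progress, target=1.0)
```

This gives the flavour of a distribution prediction for the completion time; the paper's method is a proper Gauss-Markov model with constant per-update cost, whereas this naive sketch scales cubically with the number of increments.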
Notes
- 1.
The exposition in this section could also be formulated more generally (for Matérn-class covariances) in the framework of state-space models and associated filters [11]. The derivations here hold only for our specific choice of the Ornstein-Uhlenbeck kernel (the first member of the Matérn class), but they allow a more straightforward treatment of the uncertainty on the parametric mean.
- 2.
- 3.
Dataset available at http://sidekick.epfl.ch/data.
- 4.
References
Ajne, B., Dalenius, T.: Några tillämpningar av statistiska idéer på numerisk integration. Nordisk Mat. Tidskr. 8, 145–152 (1960)
Bishop, C.: Pattern Recognition and Machine Learning. Springer, Berlin (2006)
Diaconis, P.: Bayesian numerical analysis. Stat. Decis. Theory Relat. Top. IV 1, 163–175 (1988)
Etter, V., Grossglauser, M., Thiran, P.: Launch hard or go home!: predicting the success of kickstarter campaigns. In: Proceedings of the First ACM Conference on Online Social Networks, COSN ’13, pp. 177–182. ACM, New York (2013)
Garnett, R., Osborne, M., Hennig, P.: Active learning of linear embeddings for Gaussian processes. In: Uncertainty in Artificial Intelligence (2014)
Minka, T.: Deriving quadrature rules from Gaussian processes. Technical report, Statistics Department, Carnegie Mellon University (2000)
Osborne, M., Duvenaud, D., Garnett, R., Rasmussen, C., Roberts, S., Ghahramani, Z.: Active learning of model evidence using Bayesian quadrature. In: Advances in NIPS, pp. 46–54 (2012)
Peres, S., Kortum, P., Stallmann, K.: Auditory progress bars: preference, performance and aesthetics. In: Proceedings of the 13th International Conference on Auditory Display, Montreal, Canada, 26–29 June 2007
Rasmussen, C., Williams, C.: Gaussian Processes for Machine Learning. MIT, Cambridge (2006)
Rogers, L., Williams, D.: Diffusions, Markov Processes and Martingales, vol. 1: Foundations, 2nd edn. Cambridge University Press, Cambridge (2000)
Särkkä, S.: Bayesian Filtering and Smoothing, vol. 3. Cambridge University Press, Cambridge (2013)
Snelson, E., Rasmussen, C., Ghahramani, Z.: Warped Gaussian processes. In: Advances in Neural Information Processing Systems (2004)
Uhlenbeck, G., Ornstein, L.: On the theory of the Brownian motion. Phys. Rev. 36(5), 823 (1930)
Copyright information
© 2014 Springer International Publishing Switzerland
About this paper
Cite this paper
Kiefel, M., Schuler, C., Hennig, P. (2014). Probabilistic Progress Bars. In: Jiang, X., Hornegger, J., Koch, R. (eds) Pattern Recognition. GCPR 2014. Lecture Notes in Computer Science(), vol 8753. Springer, Cham. https://doi.org/10.1007/978-3-319-11752-2_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-11751-5
Online ISBN: 978-3-319-11752-2