Web Performance Pitfalls

  • Theresa Enghardt
  • Thomas Zinner
  • Anja Feldmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11419)

Abstract

Web performance is widely studied in terms of load times, numbers of objects, object sizes, and total page sizes. However, for all of these metrics there are various definitions, data sources, and measurement tools. These often lead to different results, yet almost no study provides sufficient detail about the metric definitions and data sources it uses. This hinders both the reproducibility and the comparability of the results. This paper revisits the various definitions and quantifies their impact on performance results. To do so, we assess Web metrics across a large variety of Web pages.
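To make the ambiguity concrete: a browser's Navigation Timing API exposes a single timing entry per page load, yet even this one data source supports several plausible load-time definitions. The following TypeScript sketch (run in the page context after the load event; an illustration, not the paper's measurement code) derives two of them, which diverge whenever the navigation involves redirects:

    // Two "load time" definitions from the same Navigation Timing entry.
    const [nav] = performance.getEntriesByType(
      "navigation"
    ) as PerformanceNavigationTiming[];

    // Definition A: the clock starts at navigation start (startTime is 0),
    // so any initial redirects are included.
    const loadInclRedirects = nav.loadEventEnd - nav.startTime;

    // Definition B: the clock starts at fetchStart, i.e., after redirects.
    const loadExclRedirects = nav.loadEventEnd - nav.fetchStart;

    console.log(`redirects:            ${nav.redirectCount}`);
    console.log(`load incl. redirects: ${loadInclRedirects.toFixed(1)} ms`);
    console.log(`load excl. redirects: ${loadExclRedirects.toFixed(1)} ms`);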

Amazingly, even for such “obvious” metrics as load time, the differences can be huge. For example, depending on the exact definition of load time, the load times of more than 50% of the pages vary by more than 19.1%, and those of 10% of the pages by more than 47%. Among the main culprits for such differences are the in- or exclusion of initial redirects and the choice of data source, e.g., the Resource Timing API or HTTP Archive (HAR) files. Even “simpler” metrics such as the number of objects per page show huge variance: for the Alexa 1000, we observed a difference of more than 67 objects for 10% of the pages, with a median difference of 7 objects. This highlights the importance of precisely specifying all metrics, including how and from which data source they are computed.
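The object-count ambiguity can be illustrated the same way. This Node/TypeScript sketch derives two defensible counts from one HAR capture; the file name page.har and the filtering rules are illustrative assumptions, not the paper's methodology:

    // Two object counts from the same HAR file (path is hypothetical).
    import { readFileSync } from "node:fs";

    interface HarEntry {
      response: { status: number; content: { size: number } };
    }

    const har = JSON.parse(readFileSync("page.har", "utf8"));
    const entries: HarEntry[] = har.log.entries;

    // Definition A: every recorded request, including failures,
    // redirects, and cache revalidations (304s).
    const allEntries = entries.length;

    // Definition B: only successful responses that carried a body.
    const withBody = entries.filter(
      (e) =>
        e.response.status >= 200 &&
        e.response.status < 300 &&
        e.response.content.size > 0
    ).length;

    console.log(`objects, all entries:     ${allEntries}`);
    console.log(`objects, 2xx with a body: ${withBody}`);

A Resource Timing capture of the same page would yield yet another count: it omits the main document itself, and browsers cap the number of buffered entries unless performance.setResourceTimingBufferSize is raised.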

Keywords

Web performance · Measurement

Notes

Acknowledgements

Thanks to Dominik Strohmeier for the discussion and the pointers to resources, to our shepherd Jelena Mirkovic, and to our anonymous reviewers.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Theresa Enghardt (1)
  • Thomas Zinner (1)
  • Anja Feldmann (2)
  1. TU Berlin, Berlin, Germany
  2. Max-Planck Institute for Informatics, Saarbrücken, Germany
