How Unique Is Your Web Browser?

  • Peter Eckersley
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6205)


We investigate the degree to which modern web browsers are subject to “device fingerprinting” via the version and configuration information that they will transmit to websites upon request. We implemented one possible fingerprinting algorithm, and collected these fingerprints from a large sample of browsers that visited our test site. We observe that the distribution of our fingerprint contains at least 18.1 bits of entropy, meaning that if we pick a browser at random, at best we expect that only one in 286,777 other browsers will share its fingerprint. Among browsers that support Flash or Java, the situation is worse, with the average browser carrying at least 18.8 bits of identifying information. 94.2% of browsers with Flash or Java were unique in our sample.
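The entropy figure can be made concrete with a small sketch (illustrative only; the sample data and function name are invented here, not taken from the paper): a naive plug-in estimate of Shannon entropy over observed fingerprints. With H bits of entropy, a randomly chosen browser's fingerprint is expected to be shared by roughly one in 2^H other browsers, which is how 18.1 bits corresponds to an anonymity set on the order of one in several hundred thousand.

```python
import math
from collections import Counter

def fingerprint_entropy(fingerprints):
    """Plug-in Shannon entropy (in bits) of an observed fingerprint
    distribution. A naive estimate for illustration; the paper reports
    that the true distribution carries *at least* 18.1 bits."""
    counts = Counter(fingerprints)
    n = len(fingerprints)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy sample (hypothetical fingerprint labels, not real data):
sample = ["ff-linux", "chrome-win", "chrome-win", "safari-mac"]
h = fingerprint_entropy(sample)
# With H bits of entropy, a random browser is expected to share its
# fingerprint with roughly one in 2**H others.
print(f"{h:.2f} bits -> about one in {2 ** h:.0f} browsers")
```

The same relation, read in reverse, is how the abstract's numbers connect: 2^18.1 is on the order of the quoted one-in-286,777 figure (the exact value comes from the unrounded entropy estimate).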

By observing returning visitors, we estimate how rapidly browser fingerprints might change over time. In our sample, fingerprints changed quite rapidly, but even a simple heuristic was usually able to guess when a fingerprint was an “upgraded” version of a previously observed browser’s fingerprint, with 99.1% of guesses correct and a false positive rate of only 0.86%.
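The abstract does not spell out the heuristic here, but the idea can be sketched as follows (a hypothetical illustration, not the authors' exact algorithm): guess that a new fingerprint is an "upgraded" version of a previously observed one when the two differ in exactly one component, as happens when a browser update changes only the User-Agent string.

```python
def looks_like_upgrade(old: dict, new: dict) -> bool:
    """Hedged sketch of an upgrade-matching heuristic (illustrative;
    not the paper's exact rule): treat `new` as an upgraded version of
    `old` if the fingerprints differ in exactly one component."""
    if old.keys() != new.keys():
        return False
    changed = [k for k in old if old[k] != new[k]]
    return len(changed) == 1

# Hypothetical fingerprint components for illustration:
old_fp = {"user_agent": "Firefox/3.5", "plugins": "Flash 10", "timezone": "-480"}
new_fp = {"user_agent": "Firefox/3.6", "plugins": "Flash 10", "timezone": "-480"}
print(looks_like_upgrade(old_fp, new_fp))  # only the User-Agent changed
```

A rule this simple will mislabel genuinely distinct browsers that happen to differ in one component, which is why the paper reports both an accuracy (99.1% of guesses correct) and a false-positive rate (0.86%) for its heuristic.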

We discuss what privacy threat browser fingerprinting poses in practice, and what countermeasures may be appropriate to prevent it. There is a tradeoff between protection against fingerprintability and certain kinds of debuggability, which in current browsers is weighted heavily against privacy. Paradoxically, anti-fingerprinting privacy technologies can be self-defeating if they are not used by a sufficient number of people; we show that some privacy measures currently fall victim to this paradox, but others do not.







Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Peter Eckersley, Electronic Frontier Foundation, USA
