Wireless Personal Communications, Volume 61, Issue 3, pp 511–526

A Composite Privacy Leakage Indicator


Abstract

This paper proposes a Subjective-Logic-based composite privacy leakage metric that takes into account both the amount of information leakage and the fact that information with high entropy may in some cases be considered encrypted. It is furthermore shown, both analytically and experimentally, that Min-entropy is better suited than Shannon, Rényi or Max entropy for identifying encrypted content in the composite metric. This is particularly useful for implementing privacy-enhanced Intrusion Detection Systems (IDS), where sampled encrypted traffic can be considered to carry a low risk of revealing sensitive information. The combined metric can be used in a Policy Enforcement Point that acts as a proxy/anonymiser in order to reduce the leakage of private or sensitive information from the IDS sensors to an outsourced Managed Security Service provider. Although the composite privacy indicator is IDS-specific, the authorisation architecture is general, and may also be useful for anonymising or pseudonymising sensitive information from or to other types of sensors that need to be exposed to the Internet. The solution is based on the eXtensible Access Control Markup Language (XACML) policy language extended with support for Subjective Logic, in order to provide a method for expressing fine-grained access control policies based on uncertain evidence.
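The entropy measures compared in the abstract can be illustrated with a minimal Python sketch. This is not the paper's implementation: the sample data and function names are invented for the example, and the paper's per-flow estimation may differ. It shows why Min-entropy is a conservative discriminator: a single dominant byte value pulls Min-entropy down much faster than Shannon entropy, so near-uniform (ciphertext-like) data stands out more clearly.

```python
import math
from collections import Counter

def byte_probs(data: bytes):
    """Empirical probability of each byte value observed in the sample."""
    counts = Counter(data)
    n = len(data)
    return [c / n for c in counts.values()]

def shannon_entropy(p):
    """Shannon entropy H1 = -sum p_i log2 p_i (bits per byte)."""
    return -sum(pi * math.log2(pi) for pi in p)

def renyi_entropy(p, alpha=2.0):
    """Rényi entropy of order alpha (alpha=2 is collision entropy)."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

def min_entropy(p):
    """Min-entropy H_inf = -log2(max p_i); driven by the most likely symbol."""
    return -math.log2(max(p))

def max_entropy(p):
    """Max (Hartley) entropy H0 = log2 of the number of observed symbols."""
    return math.log2(len(p))

# Uniform-looking (ciphertext-like) vs highly skewed (plaintext-like) samples
random_like = bytes(range(256)) * 4      # every byte value equally frequent
text_like = b"aaaaaaaaaaaaaaaaaaaaabc"  # dominated by one symbol

for label, sample in [("uniform", random_like), ("skewed", text_like)]:
    p = byte_probs(sample)
    print(label, round(min_entropy(p), 2), round(shannon_entropy(p), 2))
```

For the uniform sample all four measures reach the 8-bit maximum, while for the skewed sample Min-entropy drops well below Shannon entropy, since H0 ≥ H1 ≥ H2 ≥ H∞ always holds. A threshold on Min-entropy therefore yields fewer false "encrypted" classifications on skewed plaintext than the other measures.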

Keywords

Privacy policy authorisation · Anonymisation · Subjective logic · Network monitoring · XACML · Outsourcing



Copyright information

© Springer Science+Business Media, LLC. 2011

Authors and Affiliations

  1. University of Agder, Grimstad, Norway
