
Adverse consequences of access to individuals’ information: an analysis of perceptions and the scope of organisational influence

Empirical Research

European Journal of Information Systems

Abstract

Organisations are highly interested in collecting and analysing customer data to enhance their service offerings and customer interaction. However, individuals increasingly fear how such practices may negatively affect them. Although previous studies have investigated individuals’ concerns about information privacy practices, the adverse consequences people associate with external actors accessing their personal information remain unclear. To mitigate customers’ fears, organisations need to know which adverse consequences individuals are afraid of and how to address those negative perceptions. To investigate this topic, we conducted 22 focus groups with 119 participants. We developed a comprehensive conceptualisation and categorisation of individuals’ perceived adverse consequences of access to their information that includes seven types of consequences: psychological, social, career-related, physical, resource-related, prosecution-related, and freedom-related. Although individuals may limit their interactions with an organisation owing to consequences they associate with both the organisation and other actors, organisations can apply preventive and corrective mechanisms to mitigate some of these negative perceptions. However, organisations’ scope of influence is limited and some fears may be mitigated only by individuals themselves or government regulation, if at all.


Notes

  1. We thank an anonymous reviewer for highlighting this possibility.

References

  • Abbasi A, Sarker S and Chiang RH (2016) Big data research in information systems: toward an inclusive research agenda. Journal of the Association for Information Systems 17(2), i–xxxii.

  • Acquisti A, Brandimarte L and Loewenstein G (2015) Privacy and human behavior in the age of information. Science 347(6221), 509–514.

  • Altman I (1975) The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding. Brooks/Cole Publishing Company, Monterey, CA.

  • Anderson CL and Agarwal R (2011) The digitization of healthcare: boundary risks, emotion, and consumer willingness to disclose personal health information. Information Systems Research 22(3), 469–490.

  • Barbour R (2008) Doing Focus Groups. Sage, London.

  • Bauer R (1960) Consumer behavior as risk taking. In Dynamic Marketing for a Changing World, pp 389–398, American Marketing Association, Chicago, IL.

  • BCG (2013) The value of our digital identity. https://www.bcgperspectives.com/content/articles/digital_economy_consumer_insight_value_of_our_digital_identity/. Accessed June 29, 2013.

  • Bélanger F (2012) Theorizing in information systems research using focus groups. Australasian Journal of Information Systems 17(2), 109–135.

  • Bélanger F and Crossler RE (2011) Privacy in the digital age: a review of information privacy research in information systems. MIS Quarterly 35(4), 1017–1042.

  • BITKOM (2015) Internetnutzer gehen pragmatisch mit Datenschutz um [Internet users take a pragmatic approach to data protection]. https://www.bitkom.org/Presse/Presseinformation/Internetnutzer-gehen-pragmatisch-mit-Datenschutz-um.html. Accessed January 22, 2016.

  • Chen J, Ping W, Xu Y and Tan B (2009) Am I afraid of my peers? Understanding the antecedents of information privacy concerns in the online social context. In Proceedings of the Thirtieth International Conference on Information Systems (Chen H and Slaughter S, Eds), Association for Information Systems, Phoenix, AZ.

  • Chiu C-M, Wang ETG, Fang Y-H and Huang H-Y (2014) Understanding customers’ repeat purchase intentions in B2C e-commerce: the roles of utilitarian value, hedonic value and perceived risk. Information Systems Journal 24(1), 85–114.

  • Conger S, Pratt JH and Loch KD (2013) Personal information privacy and emerging technologies. Information Systems Journal 23(5), 401–417.

  • Corbin J and Strauss A (2008) Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Sage, Newbury Park, CA.

  • Corbin JM and Strauss A (1990) Grounded theory research: procedures, canons, and evaluative criteria. Qualitative Sociology 13(1), 3–21.

  • Cunningham SM (1967) The major dimensions of perceived risk. In Risk Taking and Information Handling in Consumer Behavior (Cox DF, Ed), pp 82–108, Harvard University Press, Boston, MA.

  • Dinev T (2014) Why would we care about privacy? European Journal of Information Systems 23(2), 97–102.

  • Dinev T and Hart P (2006) An extended privacy calculus model for e-commerce transactions. Information Systems Research 17(1), 61–80.

  • Dinev T, Xu H, Smith JH and Hart P (2013) Information privacy and correlates: an empirical attempt to bridge and distinguish privacy-related concepts. European Journal of Information Systems 22(3), 295–316.

  • Dowling GR (1986) Perceived risk: the concept and its measurement. Psychology and Marketing 3(3), 193–210.

  • EMC (2016) EMC privacy index. http://www.emc.com/campaign/privacy-index/index.htm?pid=home-emcprivacyindex-120614. Accessed September 29, 2016.

  • Faja S and Trimi S (2006) Influence of the web vendor’s interventions on privacy-related behaviors in e-commerce. Communications of the Association for Information Systems 17, 2–68.

  • Featherman MS and Pavlou PA (2003) Predicting e-services adoption: a perceived risk facets perspective. International Journal of Human-Computer Studies 59(4), 451–474.

  • Fern EF (2001) Advanced Focus Group Research. Sage, London.

  • Glover S and Benbasat I (2010) A comprehensive model of perceived risk of e-commerce transactions. International Journal of Electronic Commerce 15(2), 47–78.

  • Greenaway KE and Chan YE (2013) Designing a customer information privacy program aligned with organizational priorities. MIS Quarterly Executive 12(3), 137–150.

  • Hann I-H, Hui K-L, Lee S-YT and Png IPL (2007) Overcoming online information privacy concerns: an information-processing theory approach. Journal of Management Information Systems 24(2), 13–42.

  • Hong W and Thong JYL (2013) Internet privacy concerns: an integrated conceptualization and four empirical studies. MIS Quarterly 37(1), 275–298.

  • Hui K-L, Teo HH and Lee S-YT (2007) The value of privacy assurance: an exploratory field experiment. MIS Quarterly 31(1), 19–33.

  • Jacoby J and Kaplan LB (1972) The components of perceived risk. In Proceedings of the Third Annual Conference of the Association for Consumer Research (Venkatesan M, Ed), pp 382–393, Association for Consumer Research, Chicago, IL.

  • Jeong B-K, Zhao K and Khouja M (2012) Consumer piracy risk: conceptualization and measurement in music sharing. International Journal of Electronic Commerce 16(3), 89–118.

  • Jiang Z, Heng CS and Choi BC (2013) Privacy concerns and privacy-protective behavior in synchronous online social interactions. Information Systems Research 24(3), 579–595.

  • Junglas IA, Johnson NA and Spitzmüller C (2008) Personality traits and concern for privacy: an empirical study in the context of location-based services. European Journal of Information Systems 17(4), 387–402.

  • Kim DJ, Yim M, Sugumaran V and Rao HR (2016) Web assurance seal services, trust and consumers’ concerns: an investigation of e-commerce transaction intentions across two nations. European Journal of Information Systems 25(3), 252–273.

  • Kim K and Kim J (2011) Third-party privacy certification as an online advertising strategy: an investigation of the factors affecting the relationship between third-party certification and initial trust. Journal of Interactive Marketing 25(3), 145–158.

  • Kitzinger J (1995) Qualitative research: introducing focus groups. British Medical Journal 311(7000), 299–302.

  • Krasnova H, Günther O, Spiekermann S and Koroleva K (2009) Privacy concerns and identity in online social networks. Identity in the Information Society 2(1), 39–63.

  • Krasnova H, Spiekermann S, Koroleva K and Hildebrand T (2010) Online social networks: why we disclose. Journal of Information Technology 25(2), 109–125.

  • Lazarus RS and Folkman S (1984) Stress, Appraisal, and Coping. Springer, New York, NY.

  • Lincoln YS and Guba EG (1985) Naturalistic Inquiry. Sage, Newbury Park, CA.

  • Luo X, Li H, Zhang J and Shim JP (2010) Examining multi-dimensional trust and multi-faceted risk in initial acceptance of emerging technologies: an empirical study of mobile banking services. Decision Support Systems 49(2), 222–234.

  • Malhotra NK, Kim SS and Agarwal J (2004) Internet users’ information privacy concerns (IUIPC): the construct, the scale, and a causal model. Information Systems Research 15(4), 336–355.

  • McKinsey & Company (2013) Perspectives on retail and consumer goods. http://www.mckinsey.com/client_service/retail/latest_thinking/perspectives_spring_2013. Accessed January 22, 2016.

  • Meichenbaum D (1977) Cognitive-Behaviour Modification: An Integrative Approach. Springer, New York, NY.

  • Miltgen CL and Peyrat-Guillard D (2014) Cultural and generational influences on privacy concerns: a qualitative study in seven European countries. European Journal of Information Systems 23(2), 103–125.

  • Mitchell V-W (1999) Consumer perceived risk: conceptualisations and models. European Journal of Marketing 33(1/2), 163–195.

  • Mitchell V-W and Greatorex M (1993) Risk perception and reduction in the purchase of consumer services. Service Industries Journal 13(4), 179–200.

  • Mothersbaugh DL, Foxx WK, Beatty SE and Wang S (2012) Disclosure antecedents in an online service context: the role of sensitivity of information. Journal of Service Research 15(1), 76–98.

  • Oetzel MC and Spiekermann S (2014) A systematic methodology for privacy impact assessments: a design science approach. European Journal of Information Systems 23(2), 126–150.

  • Parks R, Xu H, Chu C-H and Lowry PB (2016) Examining the intended and unintended consequences of organisational privacy safeguards. European Journal of Information Systems 26(1), 37–65.

  • Petronio S (2002) Boundaries of Privacy: Dialectics of Disclosure. State University of New York Press, Albany, NY.

  • Posey C, Lowry PB, Roberts TL and Ellis TS (2010) Proposing the online community self-disclosure model: the case of working professionals in France and the U.K. who use online communities. European Journal of Information Systems 19(2), 181–195.

  • Preibusch S (2013) Guide to measuring privacy concern: review of survey and observational instruments. International Journal of Human-Computer Studies 71(12), 1133–1143.

  • Sánchez Abril P, Levin A and Del Riego A (2012) Blurred boundaries: social media privacy and the twenty-first-century employee. American Business Law Journal 49(1), 63–124.

  • Sarker S and Sarker S (2009) Exploring agility in distributed information systems development teams: an interpretive study in an offshoring context. Information Systems Research 20(3), 440–461.

  • Van Slyke C, Shim JT, Johnson R and Jiang J (2006) Concern for information privacy and online consumer purchasing. Journal of the Association for Information Systems 7(6), 415–444.

  • Smith HJ, Dinev T and Xu H (2011) Information privacy research: an interdisciplinary review. MIS Quarterly 35(4), 989–1016.

  • Smith HJ, Milberg SJ and Burke SJ (1996) Information privacy: measuring individuals’ concerns about organizational practices. MIS Quarterly 20(2), 167–196.

  • Stewart DW, Shamdasani PN and Rook DW (2007) Focus Groups: Theory and Practice. Sage, Newbury Park, CA.

  • Strong DM and Volkoff O (2010) Understanding organization–enterprise system fit: a path to theorizing the information technology artifact. MIS Quarterly 34(4), 731–756.

  • Symantec (2015) State of privacy report 2015. http://www.symantec.com/content/en/us/about/presskits/b-state-of-privacy-report-2015.pdf. Accessed January 22, 2016.

  • TRUSTe (2013) 2013 TRUSTe US consumer confidence index. http://www.truste.com/us-consumer-confidence-index-2013/. Accessed August 7, 2013.

  • Tsai JY, Egelman S, Cranor L and Acquisti A (2011) The effect of online privacy information on purchasing behavior: an experimental study. Information Systems Research 22(2), 254–268.

  • Wall JD, Lowry PB and Barlow JB (2016) Organizational violations of externally governed privacy and security rules: explaining and predicting selective violations under conditions of strain and excess. Journal of the Association for Information Systems 17(1), 39–76.

  • Walsham G (1995) Interpretive case studies in IS research: nature and method. European Journal of Information Systems 4(2), 74–81.

  • Westin AF (1967) Privacy and Freedom. Atheneum Press, New York, NY.

  • Wu Y, Ryan S and Windsor J (2009) Influence of social context and affect on individuals’ implementation of information security safeguards. In Proceedings of the Thirtieth International Conference on Information Systems (Chen H and Slaughter S, Eds), Association for Information Systems, Phoenix, AZ.

  • Xu H, Dinev T, Smith J and Hart P (2011) Information privacy concerns: linking individual perceptions with institutional privacy assurances. Journal of the Association for Information Systems 12(12), 798–824.

  • Xu H, Teo H-H, Tan BC and Agarwal R (2009) The role of push-pull technology in privacy calculus: the case of location-based services. Journal of Management Information Systems 26(3), 135–174.


Acknowledgements

We thank the editors and three anonymous reviewers for their feedback and guidance throughout the review process. In addition, we are grateful to Lisa Heller and our other student research assistants Katja Englberger, Viktoria Schlichte, and Christin Schaller for their support of this research project.

Author information

Corresponding author

Correspondence to Sabrina Karwatzki.

Additional information

Special Issue Editors: Paul Benjamin Lowry, Tamara Dinev, Robert Willison

Appendices

Appendix A: Focus group set-up

Focus group | Composition | Age [average] | Number of participants [female/male] | Duration in min
FG1 | Pupils, 8th grade | 13–14 [13.6] | 6 [2/4] | 40
FG2 | Pupils, 10th grade | 15–16 [15.5] | 10 [7/3] | 75
FG3 | Pupils, 11th grade | 17–18 [17.3] | 8 [3/5] | 90
FG4 | Bachelor students | 18–25 [20.6] | 5 [2/3] | 75
FG5 | Bachelor and Master students | 20–27 [23.0] | 5 [1/4] | 70
FG6 | Bachelor and Master students | 21–22 [21.6] | 5 [5/0] | 70
FG7 | Bachelor students | 21–23 [22.0] | 5 [4/1] | 60
FG8 | Bachelor and Master students | 22–24 [23.0] | 4 [2/2] | 70
FG9 | Bachelor and Master students | 21–25 [23.8] | 5 [2/3] | 100
FG10 | Bachelor and Master students | 21–25 [23.0] | 5 [1/4] | 100
FG11 | Employees in a canteen | 24–43 [31.0] | 5 [3/2] | 45
FG12 | Calculation and aerospace engineers, management consultants, engineer in control and automation technology | 24–58 [40.6] | 5 [2/3] | 90
FG13 | Business engineer, insurance employee at the executive office of IT, sales representative, industrial clerk, sales representative | 25–28 [26.0] | 5 [2/3] | 70
FG14 | Head of marketing and product management, nurse, assistant office manager in a bank branch, employee in an engineering office, physiotherapist, architectural drafter | 25–29 [26.5] | 6 [3/3] | 75
FG15 | Factory mechanic, apprentice for health service clerk, educator, car body and vehicle builder, farmer | 25–29 [27.0] | 5 [2/3] | 75
FG16 | Truck driver, master craftsman in precision mechanics, industrial clerk, gardener, farmer | 25–29 [27.8] | 5 [3/2] | 80
FG17 | Journalists | 28–68 [39.6] | 5 [2/3] | 95
FG18 | Farmers, master electrician, agricultural engineer | 38–51 [45.0] | 4 [0/4] | 40
FG19 | Housewives and farmers | 44–51 [47.6] | 5 [5/0] | 60
FG20 | Tool set-up worker, employee at a publishing house, employees at a metal working company, teacher | 45–62 [51.4] | 5 [2/3] | 65
FG21 | Banker, police officer, group leader in quality assurance sector, lecturer in adult education, employee in an educational institute | 50–58 [53.8] | 5 [2/3] | 105
FG22 | Psychiatrists, psychoanalysts, psychotherapist, medical doctor | 63–71 [65.2] | 6 [2/4] | 95

Appendix B: Interview guide for conducting the focus groups

  1. Round of introductions

  2. Activities and types of information that individuals perform/disclose online

     (a) Which online activities do you perform (e.g. usage of social networks, search engines, e-mails, online banking, online shopping, cloud services…)?

     (b) Which information do you share when performing these activities?

     (c) Which activities do you NOT perform online? Why not?

  3. Information sensitivity

     (a) What is sensitive information? Why are these types of information sensitive?

  4. Privacy concerns and privacy violations

     (a) Which concerns do you have when sharing personal information online (with respect to the activities that you do and don’t perform)?

     (b) How could this information be misused? Who would be interested in using your disclosed information?

     (c) Which consequences could arise for you? Which losses do you associate with information disclosure?

     (d) Have you (or any person you know) ever been subject to a privacy violation?

     (e) Are there any activities that you perform, even though you see negative consequences that could arise? If yes, why?

     (f) Have you ever consciously changed your behaviour due to perceived risks?

  5. Mitigation mechanisms

     (a) What do you do to actively manage your privacy?

     (b) Who is responsible for managing your privacy?

     (c) What do you expect organisations to do with respect to your privacy?

Appendix C: Excerpt coding

(Figure a: excerpt of the coding; not reproduced here.)

Appendix D: Evaluating trustworthiness (based on Lincoln and Guba (1985))

Credibility
Goal: Assess whether the results are believable.
Appraisal: To ensure credibility, we followed Fern’s (2001) established guidelines for conducting focus groups. Gathering data from 22 focus groups gave us a variety of insights from our 119 participants, who varied along several dimensions such as age and experience, as described above. Towards the end, the types of consequences discussed recurred heavily across groups, indicating that we had reached data saturation. Our focus group moderators were well trained and guided the sessions thoughtfully, so we are confident that our findings are congruent with participants’ perceived reality.

Transferability
Goal: Assess whether the results can be applied to other contexts.
Appraisal: As the aim of our research is to provide a general categorisation of the perceived adverse consequences that can arise when other parties have access to individuals’ information, our findings are largely context-independent. These general findings enable context-specific research on perceived adverse consequences and allow future researchers to elaborate on the categories most relevant to their specific setting.

Dependability
Goal: Assess whether the findings are consistent.
Appraisal: In line with Parks et al (2016), we ensured dependability through inquiry audits: besides the authors themselves, who assessed the focus group procedures as well as the data coding and analysis, three colleagues at the authors’ departments and four research assistants critically evaluated these steps, e.g. by scrutinising the interview guide and the identified concepts.

Confirmability
Goal: Assess whether the results are confirmable.
Appraisal: All focus group transcripts were coded not only by the first author but also by an experienced research assistant to ensure that the findings accurately reflect the data. We also collected and integrated feedback from other researchers (professors and doctoral students outside our departments) received during various presentations and discussions.

Appendix E: Empirical grounding of the study (based on Corbin and Strauss (2008))

Criterion 1: Are concepts generated?
All presented concepts are grounded in the data. We derived them by conducting open and axial coding and by constantly comparing and refining the evolving concepts. Exemplary quotes that illustrate our concepts are presented in Tables 4–10 and Appendix G.

Criterion 2: Are the concepts systematically related?
Our findings show how our concepts relate to each other. For example, we show that adverse consequences of access to individuals’ information can be classified into seven categories. Moreover, we uncover further systematic relations by linking adverse consequences to the actors that may be seen as the source of these consequences.

Criterion 3: Are there many conceptual linkages and are the categories well developed? Do they have conceptual density?
We employed open and axial coding. Throughout this process, the concepts that emerged were linked to each other, and several early concepts were condensed into overarching categories. We also ensured the conceptual density of the categories by identifying and specifying the properties of all categories in detail.

Criterion 4: Is much variation built into the theory?
Our study provides an extensive framework of the adverse consequences that may arise from access to individuals’ information, associates those consequences with actors, and discusses the organisational scope of influence for mitigating these negative perceptions. Our results are thus independent of a specific setting and serve as an overarching framework that can be contextualised in future studies investigating information access in specific situations.

Criterion 5: Are the broader conditions that affect the study built into its explanation?
Although the aim of our study was to identify adverse consequences of information access independent of a specific context and to analyse the role of actors in general, we expect the importance of adverse consequences to vary across contexts. In that sense, our framework facilitates a deeper understanding of how broader conditions affect adverse consequences.

Criterion 6: Has process been taken into account?
This study identifies several mechanisms that can mitigate the negative perceptions associated with access to individuals’ information. We discuss the conditions under which change may occur in the “Findings” section.

Criterion 7: Do the theoretical findings seem significant and to what extent?
As our findings are context-independent, they can serve as a fundamental starting point for future research that investigates context-specific privacy phenomena, in particular individuals’ perceptions of situations in which they fear an intrusion of privacy. We regard this as support for the significance of our findings.

Criterion 8: Does the theory stand the test of time and become part of the discussions and ideas exchanged amongst relevant social and professional groups?
We are confident that our general framework of adverse consequences of access to individuals’ information can inspire future research. In particular, we suggest conducting more contextualised studies that apply our findings to better understand individuals’ privacy perceptions and behavioural reactions in specific circumstances. The framework is comprehensive and should be stable over time; however, the relative importance of the different adverse consequences may change. Our types of adverse consequences allow a deeper investigation of such potentially shifting evaluations.

Appendix F: Research process evaluation criteria (based on Corbin and Strauss (2008))

Criterion 1: How was the original sample selected? On what grounds?
We conducted 22 focus groups with 119 participants. We sought heterogeneity across groups and homogeneity within groups: the former to capture many different perspectives and avoid missing relevant findings, the latter to preserve an environment that fostered openness and encouraged participants to share experiences as rich and detailed as possible.

Criterion 2: What major categories emerged?
Seven categories of adverse consequences emerged: physical, social, resource-related, psychological, prosecution-related, career-related, and freedom-related consequences. Moreover, we identified six actors who can be the sources of those consequences, as well as several mitigation mechanisms, which we also mapped to the different actors.

Criterion 3: What were some of the events, incidents or actions (indicators) that pointed to some of these categories?
In each focus group session, the moderators encouraged participants to share as many privacy-related experiences and incidents as possible and discussed each of them in depth. Transcripts of those stories, the perceptions of situations, and the discussions amongst participants were then coded, leading to the categories and linkages reported in the “Findings” section and exemplified in Tables 4–10 and Appendix G.

Criterion 4: On the basis of what categories did theoretical sampling proceed? That is, how did theoretical formulations guide some of the data collection? After the theoretical sampling was done, how representative did the categories prove to be?
Adverse consequences are the fundamental concept around which our study evolved. We followed an iterative process and gathered additional focus group data as long as new insights emerged in terms of new consequences, new linkages to actors, or new mitigation mechanisms. At the same time, we maintained a heterogeneous sample so as not to miss important concepts. In the beginning, several of our concepts were very specific and fine-grained; later, they were condensed into categories at a higher level of abstraction.

Criterion 5: What were some of the hypotheses pertaining to conceptual relations (i.e. among categories), and on what grounds were they formulated and validated?
Based on our data and our interpretation of them, we formulated hypotheses early during data analysis. Examples include the hypotheses that not only private contacts but also professional contacts can be the source of social consequences, and that individuals see organisations as responsible for reducing other actors’ access to personal information.

Criterion 6: Were there instances in which hypotheses did not explain what was happening in the data? How were these discrepancies accounted for? Were hypotheses modified?
As the coding process continued, categories and linkages amongst those categories emerged. Several of these categories and related hypotheses held up, while others did not. For example, in the beginning we had the impression that individuals mainly hold organisations responsible for mitigating fears of adverse consequences. We eventually modified and refined this hypothesis when we discovered that several of the consequences are associated with other actors and can neither be controlled by organisations nor have their impact mitigated by them.

Criterion 7: How and why was the core category selected? Was this selection sudden or gradual, and was it difficult or easy? On what grounds were the final analytic decisions made?
Early in the focus groups, when we investigated participants’ privacy encounters, we discovered that they talked about concrete adverse consequences they were afraid of rather than abstract concerns. We therefore selected adverse consequences as the core category of our study. While other categories also emerged early on, all other concepts are linked to adverse consequences. This decision is grounded in our analysis and was validated against our focus group transcripts.

Appendix G: Exemplary quotes for actors and associated consequences

For each actor, the adverse consequences of information access that participants associated with that actor are listed below, each with an exemplary quote from the focus groups.

Organisation

Freedom-related

“Yes, as we said before with purchasing behaviour, so it is completely influenced and they first identify what interests me and thereby they make profits because then the advertisement shows up everywhere, it depicts of what I’ve already looked at or similar things that I might like. Of course I might see something then that I like and maybe I click on it. If they create those profiles of people, they also know what that person likes and that influences the purchasing behaviour for sure” (P6, FG3)

Psychological

Participant 10: “Yes exactly, so, here on my cell phone I have, I used to have some kind of location-tracking protocol from Google. I turned that off now. I could see when I…I went on Google and then you could check where the phone was. So you could see exactly where you were walking, how long, how fast.”

Participant 9: “How precise was that?”

Participant 10: “It was precise up to 10 meters. Precise up to 10 meters! And on the map you could really see that, I don’t know, I’m at the airport and I just passed the security check when I flew to China. And that was a little bit scary” (P9 and P10, FG2)

Resource-related

Participant 2: “I just wanted to say, I do only use certain online shops. For example, I use Zalando, I buy stuff there, but I would not buy things at any smaller shops, sometimes you see these websites where they offer ideas for presents etc. I would never buy anything there”

Moderator: “Why is that?”

Participant 2: “I’m afraid that they’ll either never get my money […] Or that they clear out my account” (P2, FG15)

Private contacts

Social

“It’s the same thing in clubs. If you are responsible for young club members and there is some kind of picture, you boozing with young people, then immediately other club members or people of your village, because we live in a small village, say […] that’s not good company for my child” (P2, FG16)

Physical

“Once it happened that I was threatened by someone via Facebook. I was really extremely glad that I hadn’t shared any sensitive data online. I was really in panic because if my exact residence or my real name or other things like that had been available, it would be much easier to find me in a city not too big such as Munich” (P4, FG10)

Psychological

“For me it is also that I don’t want everybody to know everything about my life at a glance. Because I think that today, if you tell somebody your name on Facebook, that person knows a lot, if you were tagged on pictures somehow and even though you didn’t disclose a lot of information, but you still see the person has these friends, went to that school or lives in that town. So I think, you can get a lot of information that way. I don’t want any additional information to be so easily accessible there” (P2, FG9)

Professional contacts

Career-related

“Regarding the whole job aspects – it’s not my experience but that of a former colleague of mine – she was sick-listed for the whole week and at the weekend she went to a party where some pictures were taken, which were then published on the webpage of a local newspaper. Our apprenticeship instructor saw it and she got a warning letter” (P4, FG5)

Social

“I ran into difficulties with someone on a professional level and then I took the time and did research on the Internet what kind of a person he is or how his business is rated and a poor light was cast upon him. And then it took me a lot of time until I said, okay, now I’ll invest a day and drive out there and meet him in person and by doing that I realised how much reality I had lost through the Internet. So when I perceive someone face-to-face and talk to that person and look them in the eyes and say, what is going on there and how the person reacts, that is a much greater range of information that I get and that is the danger I see that we rely more and more on information available on the Internet and take it for reality which actually isn’t true” (P3, FG12)

Third-party organisations

Freedom-related

“That will be taken into greater account, especially with health insurance and things like that. If you, whatever, look up an illness several times on Google or check what it might be, it’s possible that they can already retrace, something is going on with him. Then you want supplementary insurance, then they see that, the person already looked that up three times and then they already have concerns” (P3, FG18)

Psychological

“As I said before, [I would be afraid] that someone has an extensive profile of me, with all my habits, what I buy every month and what I spend money on. I think that’s nothing but my private business. It’s not that I have anything to hide or do something criminal, it’s just that third parties should not know what I spend my money on” (P2, FG8)

Resource-related

“I see hazardous potential here. It’ll probably be only accomplished in five to ten years, but then you’ll be under constant surveillance, like happy cows, and someone will calculate our remaining life time and then adjust insurance policies to that. And everyone who does not want to be under surveillance will automatically be placed into a worse category, will be subliminally criminalised, with the assumption that he must have a good reason for acting like this. That’s what I’m afraid of” (P3, FG12)

Criminals

Prosecution-related

“I don’t know, if one registers at a pornographic website or any other illegal website or downloads something, a video, using my account, yes and then I have the problem. Then I might be reported to the police or something like that” (P4, FG6)

Physical

Participant 2: “On social media by criminals, just by persons that deliberately want to harm somebody possibly using cyberbullying. Or also specifically by perpetrators who want to kill or waylay someone, who want to stalk. I would say there is a considerable risk because everybody can get access to the data without having to specifically hack into something or…”

Moderator: “Do you mean things like your location or your address by that?”

Participant 2: “For example or everything you post there, pictures with friends and so on. Because he can find out everything about your life and then of course do something bad” (P2 and Moderator, FG5)

Resource-related

“Maybe I might also be afraid of somebody, who has a cost profile of me, knowing, oh she booked a vacation for this and that week, she won’t be home then. Criminals might know that the place is vacant. One often doesn’t know who’s behind it. So that’s what I would be afraid of” (P2, FG8)

Intelligence services

Prosecution-related

Participant 3: “Besides the fact that I’m not a criminal, I would never go to pornographic webpages, they always appear to be so unsafe.”

Moderator: “What is unsafe?”

Participant 3: “Pornographic webpages. All that stuff. I’m not interested in that, but I would never search for something like that, it’s so dubious”

Moderator: “Well, what could they do with your data?”

Participant 3: “I would be afraid that the police is immediately at my front door, just because I click on such a webpage” (P3 and Moderator, FG16)

Physical

“If I think of drones being controlled by the Americans, they use your mobile phone, then [the misuse of your location data] may result in your death. […] Someone informs them about your name and says that you are a bastard. And then they use your mobile phone number and shoot this mobile phone” (P4, FG12)

Psychological

Participant 2: “I personally know that I reveal personal information via Facebook and WhatsApp, but I accept the risks to stay in touch with my friends. Even if I still watch out, but I’m aware that I release data there”

Participant 5: “Whereas, I assume that large organisations, actually intelligence services, pose the actual danger here because all of that data has to be analysed. So it can, I don’t see the danger that some, let’s say a hacker, some criminal can make use of it. That is only possible on a large scale” (P2 and P5, FG12)

Appendix H: Exemplary quotes for a comparison of consequences across age groups

To get an understanding of potential differences, we further analysed our data after the axial coding process (see Note 1). As described in the “Research method” section, we conducted exploratory focus groups to get an overview of all possible perceptions of adverse consequences. In line with this research goal, all discussions were broad and open, and each focus group covered the topics regarding access to individuals’ information that our participants had experienced or were knowledgeable about. The discussions were based on situations and contexts that the participants usually engage in. Counting individual occurrences would therefore not do justice to the research method and might foster overinterpretation. Comparing occurrences numerically could even lead to wrong conclusions if certain types are mentioned frequently in one group but rarely in others, simply because that group’s discussions focused on different experiences. Nevertheless, we were interested in whether our data would indicate potential differences between groups that could be explored in future research. We therefore carefully reanalysed the data to identify pronounced differences between groups. We did not find any noticeable differences with respect to age or background: all categories of consequences were mentioned several times across all age groups and backgrounds. The following tables illustrate this result with one exemplary citation for each combination of category and age group (Tables 12, 13, 14, 15, 16, 17, 18).

Table 12 Prosecution-related consequences
Table 13 Social consequences
Table 14 Physical consequences
Table 15 Career-related consequences
Table 16 Freedom-related consequences
Table 17 Psychological consequences
Table 18 Resource-related consequences


About this article


Cite this article

Karwatzki, S., Trenz, M., Tuunainen, V.K. et al. Adverse consequences of access to individuals’ information: an analysis of perceptions and the scope of organisational influence. Eur J Inf Syst 26, 688–715 (2017). https://doi.org/10.1057/s41303-017-0064-z
