Journal of Business and Psychology, Volume 34, Issue 1, pp 71–86

Reactance to Electronic Surveillance: A Test of Antecedents and Outcomes

  • Allison Brown Yost
  • Tara S. Behrend
  • Garett Howardson
  • Jessica Badger Darrow
  • Jaclyn M. Jensen
Original Research

Abstract
Organizations increasingly use electronic surveillance to observe their employees’ behavior, yet the effects of this controversial practice on employee attitudes and behavior remain largely unknown. Drawing on psychological reactance theory, we examine the effects of electronic surveillance on contextual performance. Data were collected from employees of a variety of organizations (n = 238). Respondents indicated the ways their organizations used electronic surveillance, described their reactions in open-ended comments, and then completed measures of state and trait reactance, organizational citizenship behavior (OCB), and counterproductive work behavior (CWB). Perceived invasion of privacy was linked with state reactance. Anger, one component of state reactance, was associated with increased organization-directed CWB (CWB-O) and decreased organization-directed OCB (OCB-O). Further, trait reactance not only predicted invasion of privacy perceptions but also explained significant variance in CWB and OCB; after controlling for trait reactance, anger was no longer a significant predictor of these outcomes. This study provides evidence that perceptions of monitoring systems affect contextual performance and that individual differences shape reactions to surveillance, knowledge that can inform the design, implementation, and communication of monitoring systems. The study contributes to the electronic monitoring and reactance literatures by being the first to link invasion of privacy, trait and state reactance, and contextual performance. Because people experience state reactance in response to invasion of privacy, psychological reactance theory can inform the design of monitoring systems; we provide evidence that surveillance is associated with negative consequences.


Keywords: Electronic surveillance · Electronic monitoring · Personality · Psychological reactance · Invasion of privacy · Organizational citizenship behavior · Counterproductive work behavior



Acknowledgments

The authors wish to thank Peter Yu and Cecilia Ramirez for their assistance with data collection. David Costanza and Jon Willford reviewed earlier drafts of this manuscript.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Allison Brown Yost (1)
  • Tara S. Behrend (2), corresponding author
  • Garett Howardson (3)
  • Jessica Badger Darrow (4)
  • Jaclyn M. Jensen (5)

  1. CEB, Arlington, USA
  2. Department of Organizational Sciences and Communication, The George Washington University, Washington, USA
  3. Tuple Work Science, Washington, USA
  4. US Army Research Institute, Natick, USA
  5. DePaul University, Chicago, USA
