Journal of Business and Psychology, Volume 31, Issue 3, pp 339–359

Four Research Designs and a Comprehensive Analysis Strategy for Investigating Common Method Variance with Self-Report Measures Using Latent Variables

Original Paper

Abstract

Common method variance (CMV) is an ongoing topic of debate and concern in the organizational literature. We present four latent variable confirmatory factor analysis (CFA) model designs for assessing and controlling for CMV: designs based on unmeasured latent method constructs, Marker Variables, and Measured Cause Variables, along with a new hybrid design in which these three types of latent method variables are used concurrently. We then describe a comprehensive analysis strategy that can be used with all four designs and demonstrate it with the new design, the Hybrid Method Variables Model. In our discussion, we address issues that arise in implementing these designs and analyses, offer practical guidance, and advocate for the use of the Hybrid Method Variables Model. Through these means, we hope to promote a more comprehensive and consistent approach to assessing CMV in the organizational literature and more extensive use of hybrid models that include multiple types of latent method variables.
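To make the model designs concrete, the following is a minimal sketch, not taken from the paper, of how one of the four designs (the Marker Variable design) might be specified as a latent variable CFA in Python using the open-source semopy package, which accepts lavaan-style model syntax. All item names (x1-x3, y1-y3, m1-m3), the file survey_items.csv, and the constructs themselves are hypothetical, and the paper is not tied to any particular software.

```python
import pandas as pd
import semopy

# Measurement model: two substantive factors (Predictor, Outcome) and a
# marker latent variable measured by items m1-m3 chosen to be theoretically
# unrelated to the substantive constructs. The marker also loads on every
# substantive indicator so that shared method variance is absorbed by it.
# The two "~~ 0*" lines fix the marker's covariances with the substantive
# factors to zero (an orthogonal method factor), and the final line is the
# structural path of interest, now estimated with marker-related method
# variance partialled out of the indicators.
MODEL_DESC = """
Predictor =~ x1 + x2 + x3
Outcome =~ y1 + y2 + y3
Marker =~ m1 + m2 + m3 + x1 + x2 + x3 + y1 + y2 + y3
Marker ~~ 0*Predictor
Marker ~~ 0*Outcome
Outcome ~ Predictor
"""

data = pd.read_csv("survey_items.csv")  # hypothetical item-level dataset
model = semopy.Model(MODEL_DESC)
model.fit(data)

print(model.inspect())           # loadings, structural path, variances
print(semopy.calc_stats(model))  # chi-square, CFI, RMSEA, and other fit indices
```

A hybrid specification of the kind the abstract describes would add, within the same model, an unmeasured latent method construct loading on all substantive indicators (in practice usually with equality constraints on those loadings for identification) and a Measured Cause Variable such as negative affectivity, modeled as a latent with its own indicators analogous to the marker. Nested chi-square comparisons between models with and without the method loadings would then indicate whether CMV is present and whether it biases the substantive relationship.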

Keywords

Common method variance · Unmeasured latent method factor · Marker Variable · Measured method variable · Measured Cause Variable · Hybrid Method Variables Model

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. Department of Psychology, University of North Dakota, Grand Forks, USA
  2. Department of Psychology, Wayne State University, Detroit, USA
