Abstract
The method factor is usually conceptualized as unexplained variance arising from the use of more than one method to measure a construct. In the context of item keying direction, it typically represents unique variance attributable to reverse-keyed items. Alessandri et al. (Multivariate Behavioral Research, 46(4), 625–642, 2011) found that method factors from self- and peer ratings were moderately correlated with each other, but their investigation was limited to a single measurement scale (optimism). In addition, their putative method factor may have been measuring a substantive construct (pessimism) rather than mere method-related variance. The current study aimed to improve on their investigation by using multiple scales, each of which is theoretically unidimensional. Surprisingly, we replicated their finding that the method factors were moderately correlated across the two types of raters. Contrary to Alessandri et al.’s findings, however, the self-rating method factor was not correlated with personality variables but was correlated with participants’ social desirability, whereas the peer-rating method factor was correlated with closeness to, and duration of relationship with, participants, but not with social desirability. It is therefore unlikely, contrary to the explanation offered by Alessandri et al. (Multivariate Behavioral Research, 46(4), 625–642, 2011), that the method factor from peer ratings is due to social desirability. Our findings also cast doubt on the view that cognitive abilities are the sole explanation of method factors. Contrary to the recent argument by Gnambs and Schroeders (Assessment, 27(2), 404–441, 2020), method factors may not be a simple “response style artifact”.
References
Alessandri, G., Vecchione, M., Tisak, J., & Barbaranelli, C. (2011). Investigating the nature of method factors through multiple informants: Evidence for a specific factor? Multivariate Behavioral Research, 46(4), 625–642. https://doi.org/10.1080/00273171.2011.589272.
Ashton, M. C., & Lee, K. (2009). The HEXACO-60: A short measure of the major dimensions of personality. Journal of Personality Assessment, 91(4), 340–345. https://doi.org/10.1080/00223890902935878.
Ashton, M. C., Lee, K., Goldberg, L. R., & de Vries, R. E. (2009). Higher-order factors of personality: Do they exist? Personality and Social Psychology Review, 13(2), 79–91. https://doi.org/10.1177/1088868309338467.
Asparouhov, T., & Muthén, B. (2009). Exploratory structural equation modeling. Structural Equation Modeling, 16(3), 397–438. https://doi.org/10.1080/10705510903008204.
Bukowski, W. M., Hoza, B., & Boivin, M. (1994). Measuring friendship quality during pre- and early adolescence: The development and psychometric properties of the friendship qualities scale. Journal of Social and Personal Relationships, 11(3), 471–484. https://doi.org/10.1177/0265407594113011.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155–159.
Crowne, D. P., & Marlowe, D. (1960). A new scale of social desirability independent of psychopathology. Journal of Consulting Psychology, 24(4), 349–354. https://doi.org/10.1037/h0047358.
Geiser, C., Bishop, J., & Lockhart, G. (2015). Collapsing factors in multitrait-multimethod models: Examining consequences of a mismatch between measurement design and model. Frontiers in Psychology, 6, Article 946. https://doi.org/10.3389/fpsyg.2015.00946.
Gnambs, T., & Schroeders, U. (2020). Cognitive abilities explain wording effects in the Rosenberg self-esteem scale. Assessment, 27(2), 404–441. https://doi.org/10.1177/1073191117746503.
Jackson, D. N. (1984). Personality research form—Technical manual (version 3.05). London: Sigma Assessment Systems.
Kam, C. C. S. (2016). Why do we still have an impoverished understanding of the item wording effect? An empirical examination. Sociological Methods & Research, 47(3), 574–597. https://doi.org/10.1177/0049124115626177.
Kam, C. C. S. (2018). Testing the assumption of population homogeneity in the measurement of dispositional optimism: Factor mixture modeling analysis. Journal of Personality Assessment. https://doi.org/10.1080/00223891.2018.1502194.
Kam, C. C. S., & Chan, G. H. H. (2018). Examination of the validity of instructed response items in identifying careless respondents. Personality and Individual Differences, 129(1), 83–87. https://doi.org/10.1016/j.paid.2018.03.022.
Kam, C. C. S., & Fan, X. (2018). Investigating response heterogeneity in the context of positively and negatively worded items by using factor mixture modeling. Organizational Research Methods. https://doi.org/10.1177/1094428118790371.
Kam, C. C. S. (2019). Careless responding threatens factorial analytic results and construct validity of personality measure. Frontiers in Psychology, 10, Article 1258. https://doi.org/10.3389/fpsyg.2019.01258.
Kam, C. C. S., & Meyer, J. P. (2015). How careless responding and acquiescence response bias can influence construct dimensionality: The case of job satisfaction. Organizational Research Methods, 18(3), 512–541. https://doi.org/10.1177/1094428115571894.
Lambert, C. E., Arbuckle, S. A., & Holden, R. R. (2016). The Marlowe-Crowne social desirability scale outperforms the BIDR impression management scale for identifying fakers. Journal of Research in Personality, 61, 80–86. https://doi.org/10.1016/j.jrp.2016.02.004.
Lee, K., & Ashton, M. C. (2004). Psychometric properties of the HEXACO personality inventory. Multivariate Behavioral Research, 39(2), 329–358. https://doi.org/10.1207/s15327906mbr3902_8.
Lee, K., & Ashton, M. C. (2018). Psychometric properties of the HEXACO-100. Assessment, 25(5), 543–556. https://doi.org/10.1177/1073191116659134.
Liu, C. (2001). A primary test of the applicability of Marlowe-Crowne’s social desirability scale to Chinese subject. Sociological Research, 1(2), 49–57.
MacCallum, R. C., Browne, M. W., & Sugawara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1(2), 130–149. https://doi.org/10.1037/1082-989X.1.2.130.
Marsh, H. W. (1996). Positive and negative global self-esteem: A substantively meaningful distinction or artifactors? Journal of Personality and Social Psychology, 70(4), 810–819.
Marshall, G. N., Wortman, C. B., Kusulas, J. W., Hervig, L. K., & Vickers Jr., R. R. (1992). Distinguishing optimism from pessimism: Relations to fundamental dimensions of mood and personality. Journal of Personality and Social Psychology, 62(6), 1067–1074. https://doi.org/10.1037/0022-3514.62.6.1067.
Moshagen, M. (2012). The model size effect in SEM: Inflated goodness-of-fit statistics are due to the size of the covariance matrix. Structural Equation Modeling, 19(1), 86–98. https://doi.org/10.1080/10705511.2012.634724.
Paulhus, D. L. (1984). Two-component models of socially desirable responding. Journal of Personality and Social Psychology, 46(3), 598–609. https://doi.org/10.1037/0022-3514.46.3.598.
Rauch, W. A., Schweizer, K., & Moosbrugger, H. (2007). Method effects due to social desirability as a parsimonious explanation of the deviation of unidimensionality in LOT-R scores. Personality and Individual Differences, 42(8), 1597–1607.
Satorra, A., & Bentler, P. M. (1994). Corrections to test statistics and standard errors in covariance structure analysis. In A. von Eye & C. C. Clogg (Eds.), Latent variables analysis: Applications for developmental research (pp. 399–419). Thousand Oaks: Sage.
Shi, D., Lee, T., & Maydeu-Olivares, A. (2018a). Understanding the model size effect on SEM fit indices. Educational and Psychological Measurement. https://doi.org/10.1177/0013164418783530.
Shi, D., Lee, T., & Terry, R. A. (2018b). Revisiting the model size effect in structural equation modeling. Structural Equation Modeling, 25(1), 21–40. https://doi.org/10.1080/10705511.2017.1369088.
Vazire, S. (2010). Who knows what about a person? The self–other knowledge asymmetry (SOKA) model. Journal of Personality and Social Psychology, 98(2), 281–300. https://doi.org/10.1037/a0017908.
Volk, A. A., Schiralli, K., Xia, X., Zhao, J., & Dane, A. V. (2018). Adolescent bullying and personality: A cross-cultural approach. Personality and Individual Differences, 125(15), 126–132. https://doi.org/10.1016/j.paid.2018.01.012.
Witkin, H. A., Goodenough, D. R., & Oltman, P. K. (1979). Psychological differentiation: Current status. Journal of Personality and Social Psychology, 37(7), 1127–1145.
Funding
The current study was financially supported by Multi-Year Research Grant (MYRG2018–00010-FED) offered by the University of Macau to the first author.
Ethics declarations
Conflict of Interest
The authors declare they have no conflict of interest.
Ethical Approval
The study was in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed Consent
The authors obtained informed consent from all participants in the study.
Cite this article
Kam, C.C.S., Sun, S. Method factor due to the use of reverse-keyed items: Is it simply a response style artifact?. Curr Psychol 41, 1204–1212 (2022). https://doi.org/10.1007/s12144-020-00645-z