
Improving Applicant Reactions to Forced-Choice Personality Measurement: Interventions to Reduce Threats to Test Takers’ Self-Concepts

  • Original Paper
  • Journal of Business and Psychology

Abstract

Previous research has demonstrated that selection decision making is improved by the use of valid pre-employment assessments, but these assessments can often engender negative reactions from job candidates. Reactions to personality assessments tend to be particularly negative, and they are even worse for forced-choice personality assessments. This latter issue is particularly troubling given the evidence that forced-choice measurement is quite effective at reducing deliberate response distortion (i.e., faking). Given the importance organizations place on candidate experience during recruitment and selection, improving applicants’ reactions to valid selection assessments is important. Previous research has not, however, examined the reasons or mechanisms behind test takers’ especially negative reactions to forced-choice assessments. Here, we propose that forced-choice measurement threatens elements of the test taker’s self-concept, thereby engendering negative reactions to the assessment. Based on these theoretical arguments, we develop and test the efficacy of four format variations to a forced-choice assessment intended to improve test taker reactions. Results suggest that, compared to a traditional/standard forced-choice assessment, test takers reacted more positively to forced-choice formats that (1) used a graded, as opposed to dichotomous, response scale (i.e., allowing slightly (dis)agree responses); (2) included post-assessment performance feedback; and (3) removed the most socially undesirable items from the test. The theoretical and practical implications of these results are discussed.


Notes

  1. We note that there is a debate regarding the prevalence and implications of applicant faking behavior in general (e.g., Griffith et al., 2011) and regarding personality specifically (e.g., Hough & Oswald, 2008). A full review of applicant faking behavior is beyond the scope of this article; interested readers are directed to Griffith et al. (2011).

  2. As noted in the “Method” section, the measure used in this study is computer adaptive. Therefore, removing undesirable items in this case means reducing the item pool, not the number of items test takers completed. In instances wherein the forced-choice measure is not computer adaptive, this format variation may be inappropriate as it will shorten the length of an existing test. As such, if one wishes to remove undesirable items from a non-computer adaptive forced-choice measure, additional items may need to be written.

  3. We note that this intervention, as well as the next, is not specific to forced-choice personality measurement and may be valuable for improving test taker reactions to other types of pre-employment testing.

  4. Employment status was not collected in this sample. However, given the similar sampling strategies of studies 1 and 2, it is reasonable to assume that employment rates are similar. As such, we estimate that over 70% of the respondents in study 1 are likely employed.

  5. Repeated testing was used to allow test developers to assess the psychometric properties of the measure.

  6. The reactions-to-feedback measure was not included in the CFA because only about half of the respondents completed it (i.e., those in the feedback-provided conditions).

  7. Full CFA results are available from the corresponding author upon request.

  8. Full results for these analyses available from the corresponding author.

  9. For perceptions of appropriateness, a significant instruction formality × feedback interaction was observed. When explored further, instruction type did not significantly influence perceptions within either the feedback, F(2, 381) = 1.23, n.s., or the no-feedback, F(2, 362) = 2.52, n.s., conditions.

  10. The authors would like to thank an anonymous reviewer for bringing this point to our attention.

References

  • Alicke, M. D., & Sedikides, C. (2009). Self-enhancement and self-protection: What they are and what they do. European Review of Social Psychology, 20, 1–48.


  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.


  • Anderson, N., Salgado, J. F., & Hulsheger, U. R. (2010). Applicant reactions in selection: Comprehensive meta-analysis into reaction generalization versus situational specificity. International Journal of Selection and Assessment, 18, 291–304.


  • Barrick, M. R., & Mount, M. K. (1991). The big five personality dimensions and job performance: A meta-analysis. Personnel Psychology, 44, 1–26.


  • Bauer, T. N., Truxillo, D. M., Sanchez, R. J., Craig, J. M., Ferrara, P., & Campion, M. A. (2001). Applicant reactions to selection: Development of the selection procedural justice scale (SPJS). Personnel Psychology, 54, 387–419.


  • Baumeister, R. F., Bratslavsky, E., Finkenauer, C., & Vohs, K. D. (2001). Bad is stronger than good. Review of General Psychology, 5, 323–370.


  • Borman, W. C., Buck, D. E., Hanson, M. A., Motowidlo, S. J., Stark, S., & Drasgow, F. (2001). An examination of the comparative reliability, validity, and accuracy of performance ratings made using computerized adaptive rating scales. Journal of Applied Psychology, 86, 965–973.


  • Boyce, A. S., & Capman, J. F. (2017). ADEPT-15 technical report. New York, NY: Aon Assessment Solutions.


  • Brehm, J. W. (1966). A theory of psychological reactance. New York, NY: Academic Press.

  • Brooks, M. E., Dalal, D. K., & Nolan, K. P. (2014). Are common language effect size indicators easier to understand? Journal of Applied Psychology, 99, 332–340.


  • Brown, P., & Levinson, S. C. (1987). Politeness: Some universals in language usage. New York, NY: Cambridge University Press.


  • Brown, A., & Maydeu-Olivares, A. (2011). Item response modeling of forced-choice questionnaires. Educational and Psychological Measurement, 71, 460–502.

  • Campion, M. C., Campion, E. D., & Campion, M. A. (2019). Using practice employment tests to improve recruitment and personnel selection outcomes for organizations and job seekers. Journal of Applied Psychology, 104, 1089–1102.

  • Cao, M., & Drasgow, F. (2019). Does forcing reduce faking? A meta-analytic review of forced-choice personality measures in high-stakes situations. Journal of Applied Psychology, 104, 1347–1368.

  • Chapman, D. S., Uggerslev, K. L., Carroll, S. A., Piasentin, K. A., & Jones, D. A. (2005). Applicant attraction to organizations and job choice: A meta-analytic review of the correlates of recruiting outcomes. Journal of Applied Psychology, 90, 928–944.


  • Christiansen, N. D., Burns, G. N., & Montgomery, G. E. (2005). Reconsidering forced-choice item formats for applicant personality assessment. Human Performance, 18, 267–307.


  • Christiansen, N. D., Edelstein, S., & Fleming, B. (1998). Reconsidering forced-choice formats for applicant personality assessment. Paper presented at the annual meeting of the Society for Industrial/Organizational Psychology, Dallas, TX.

  • Connelly, B. L., Certo, S. T., Ireland, R. D., & Reutzel, C. R. (2011). Signaling theory: A review and assessment. Journal of Management, 37, 39–67.


  • Converse, P. D., Oswald, F. L., Imus, A., Hedricks, C., Roy, R., & Butera, H. (2008). Comparing personality test formats and warnings: Effects on criterion-related validity and test-taker reactions. International Journal of Selection and Assessment, 16, 155–169.


  • Conway, J. S., Boyce, A. S., Caputo, P. M., & Huber, C. (2015). Development of a computer adaptive forced-choice personality test. Poster presented at the 30th Annual Meeting of the Society for Industrial and Organizational Psychology, Philadelphia, PA.

  • Cortina, J. M., & Folger, R. G. (1998). When is it acceptable to accept a null hypothesis: No way, Jose? Organizational Research Methods, 1, 334–350.


  • Cortina, J. M., & Landis, R. S. (2009). When small effect sizes tell a big story, and when large effect sizes don’t. In C. E. Lance & R. J. Vandenberg (Eds.), Statistical and methodological myths and urban legends: Doctrine, verity and fable in the organizational and social sciences (pp. 287–308). New York, NY: Routledge.


  • Cunningham, M. R. (1989). Test-taking motivations and outcomes on a standardized measure of on-the-job integrity. Journal of Business and Psychology, 4, 119–127.


  • Dilchert, S., Ones, D. S., Viswesvaran, C., & Deller, J. (2006). Response distortion in personality measurement: Born to deceive, yet capable of providing valid self-assessments? Psychology Science, 48, 209–225.


  • Foldes, H. J., Duehr, E. D., & Ones, D. S. (2008). Group differences in personality: Meta-analyses comparing five U.S. racial groups. Personnel Psychology, 61, 579–616.


  • Gilliland, S. W. (1993). The perceived fairness of selection systems: An organizational justice perspective. Academy of Management Review, 18, 694–734.


  • Goffin, R. D., & Christiansen, N. D. (2003). Correcting personality tests for faking: A review of popular personality tests and an initial survey of researchers. International Journal of Selection and Assessment, 11, 340–344.


  • Griffith, R. L., Lee, L. M., Peterson, M. H., & Zickar, M. J. (2011). First dates and little white lies: A trait contract classification theory of applicant faking behavior. Human Performance, 24, 338–357.


  • Hausknecht, J. P., Day, D. V., & Thomas, S. C. (2004). Applicant reactions to selection procedures: An updated model and meta-analysis. Personnel Psychology, 57, 639–683.


  • Highhouse, S. (2008). Stubborn reliance on intuition and subjectivity in employee selection. Industrial and Organizational Psychology, 1, 333–342.


  • Highhouse, S. (2009). Designing experiments that generalize. Organizational Research Methods, 12, 554–566.


  • Highhouse, S., & Gillespie, J. Z. (2009). Do samples really matter that much? In C. E. Lance & R. J. Vandenberg (Eds.), Statistical and methodological myths and urban legends: Doctrine, verity and fable in the organizational and social sciences (pp. 249–268). New York, NY: Routledge.


  • Huber, C., Conway, J. S., & Boyce, A. S. (2015). Convergent validity of a MDPP CAT for high-stakes personality testing. Poster presented at the 30th Annual Meeting of the Society for Industrial and Organizational Psychology, Philadelphia, PA.

  • Huber, C. R., Capman, J., Boyce, A. S., & Lobene, E. V. (2017). Cross-cultural generalization of a multidimensional pairwise preference personality inventory. Presented at the 32nd Annual Meeting of the Society for Industrial and Organizational Psychology, Orlando, FL.

  • Jackson, D. N., Wroblewski, V. R., & Ashton, M. C. (2000). The impact of faking on employment tests: Does forced choice offer a solution? Human Performance, 13, 371–388.


  • Jeong, Y. R., Christiansen, N. D., Robie, C., Kung, M. C., & Kinney, T. B. (2017). Comparing applicants and incumbents: Effects of response distortion on mean scores and validity of personality measures. International Journal of Selection and Assessment, 25, 311–315.


  • McCarthy, J. M., Bauer, T. N., Truxillo, D. M., Anderson, N. R., Costa, A. C., & Ahmed, S. M. (2017). Applicant perspectives during selection: A review addressing “So what?,” “What’s new?,” and “Where to next?”. Journal of Management, 43, 1693–1725.


  • McCloy, R. A., Heggestad, E. D., & Reeve, C. L. (2005). A silk purse from the sow’s ear: Retrieving normative information from multidimensional forced-choice items. Organizational Research Methods, 8, 222–248.


  • McFarland, L. A. (2013). Applicant reactions to personality tests: Why do applicants hate them? In N. Christiansen & R. Tett (Eds.), Handbook of personality at work (pp. 281–298). New York, NY: Routledge.


  • Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17, 437–455.

  • Mount, M. K., Barrick, M. R., Scullen, S. M., & Rounds, J. (2005). Higher-order dimensions of the big five personality traits and the big six vocational interest types. Personnel Psychology, 58, 447–478.


  • Murphy, K. R. (1986). When your top choice turns you down: Effect of rejected offers on the utility of selection tests. Psychological Bulletin, 99, 133–138.


  • Murphy, K. R., & Davidshofer, C. O. (2005). Psychological testing: Principles and applications (6th ed.). Upper Saddle River, NJ: Pearson.

  • Noels, K. A., Giles, H., & Le Poire, B. (2003). Language and communication processes. In M. A. Hogg & J. Cooper (Eds.), The SAGE handbook of social psychology (pp. 232–257). Thousand Oaks, CA: SAGE Publications.


  • O'Neill, T. A., Lewis, R. J., Law, S. J., Larson, N., Hancock, S., Radan, J., Lee, N., & Carswell, J. J. (2017). Forced-choice pre-employment personality assessment: Construct validity and resistance to faking. Personality and Individual Differences, 115, 120–127.


  • Oostrom, J. K., Born, M. P., Serlie, A. W., & van der Molen, H. T. (2010). Effects of individual differences on the perceived job relatedness of a cognitive ability test and a multimedia situational judgment test. International Journal of Selection and Assessment, 18, 394–406.


  • Ployhart, R. E., & Harold, C. M. (2004). The applicant attribution-reaction theory (AART): An integrative theory of applicant attributional processing. International Journal of Selection and Assessment, 12, 84–98.


  • Ployhart, R. E., McFarland, L. A., & Ryan, A. M. (2002). Examining applicants’ attributions for withdrawal from a selection procedure. Journal of Applied Social Psychology, 32, 2228–2252.


  • R Core Team. (2013). R: A language and environment for statistical computing. Vienna: R Foundation for Statistical Computing. URL http://www.R-project.org/.


  • Revelle, W. (2019). psych: Procedures for psychological, psychometric, and personality research. R package version 1.9.7.

  • Roberts, J. S., Donoghue, J. R., & Laughlin, J. E. (2000). A generalized item response theory model for unfolding unidimensional polytomous responses. Applied Psychological Measurement, 24, 3–32.


  • Rosse, J. G., Miller, J. L., & Stecher, M. D. (1994). A field study of job applicants' reactions to personality and cognitive ability testing. Journal of Applied Psychology, 79, 987–992.


  • Rosse, J. G., Stecher, M. D., Miller, J. L., & Levin, R. A. (1998). The impact of response distortion on preemployment personality testing and hiring decisions. Journal of Applied Psychology, 83, 634–644.


  • Rothstein, M. G., & Goffin, R. D. (2006). The use of personality measures in personnel selection: What does current research support? Human Resource Management Review, 16, 155–180.


  • Ryan, A. M., & Ployhart, R. E. (2000). Applicants’ perceptions of selection procedures and decisions: A critical review and agenda for the future. Journal of Management, 26, 565–606.


  • Rynes, S. L., Bretz Jr., R. D., & Gerhart, B. (1991). The importance of recruitment in job choice: A different way of looking. Personnel Psychology, 44, 487–521.


  • Sackett, P. R., & Lievens, F. (2008). Personnel selection. Annual Review of Psychology, 59, 419–450.


  • Schmit, M. J., & Ryan, A. M. (1993). The Big Five in personnel selection: Factor structure in applicant and nonapplicant populations. Journal of Applied Psychology, 78, 966–974.


  • Schwarz, N. (1996). Cognition and communication: Judgmental biases, research methods and the logic of conversation. Hillsdale, NJ: Erlbaum.


  • Schwarz, N. (1999). Self-reports: How the questions shape the answers. American Psychologist, 54, 93–105.


  • Sedikides, C., & Gregg, A. P. (2008). Self-enhancement: Food for thought. Perspectives on Psychological Science, 3, 102–116.


  • Smither, J. W., Reilly, R. R., Millsap, R. E., Pearlman, K., & Stoffey, R. W. (1993). Applicant reactions to selection procedures. Personnel Psychology, 46, 49–76.


  • Stark, S., Chernyshenko, O. S., & Drasgow, F. (2005). An IRT approach to constructing and scoring pairwise preference items involving stimuli on different dimensions: The multi-unidimensional pairwise-preference model. Applied Psychological Measurement, 29, 184–203.


  • Stark, S., Chernyshenko, O. S., & Drasgow, F. (2006). Detecting differential item functioning with confirmatory factor analysis and item response theory: Toward a unified strategy. Journal of Applied Psychology, 91, 1292–1306.


  • Stark, S., Chernyshenko, O. S., & Drasgow, F. (2010). Tailored Adaptive Personality Assessment System (TAPAS-95s). Expanded enlistment eligibility metrics (EEEM): Recommendations on a non-cognitive screen for new soldier selection.

  • Stark, S., Chernyshenko, O. S., Drasgow, F., & White, L. A. (2012). Adaptive testing with multidimensional pairwise preference items: Improving the efficiency of personality and other noncognitive assessments. Organizational Research Methods, 15, 463–487.


  • Steiger, J. H. (1980). Testing pattern hypotheses on correlation matrices: Alternative statistics and some empirical results. Multivariate Behavioral Research, 15, 335–352.


  • Sudman, S., Bradburn, N. M., & Schwarz, N. (1996). Thinking about answers: The application of cognitive processes to survey methodology. San Francisco, CA: Jossey-Bass.


  • Swann Jr., W. B. (1983). Self-verification: Bringing social reality into harmony with the self. In J. Suls & A. G. Greenwald (Eds.), Social psychological perspectives on the self (Vol. 2, pp. 33–66). Hillsdale, NJ: Erlbaum.


  • Swann Jr., W. B. (1984). Quest for accuracy in person perception: A matter of pragmatics. Psychological Review, 91, 457–477.


  • Swann, W. B. (1987). Identity negotiation: Where two roads meet. Journal of Personality and Social Psychology, 53, 1038–1051.


  • Tett, R. P., Jackson, D. N., & Rothstein, M. (1991). Personality measures as predictors of job performance: A meta-analytic review. Personnel Psychology, 44, 703–742.


  • The Talent Board (2018a). CANDE Winners. Retrieved July 4, 2018, from http://thetalentboard.org/cande-awards/cande-winners/

  • The Talent Board (2018b). The 2018 Talent Board Candidate Experience Award and Benchmark Program. Retrieved July 4, 2018, from http://thetalentboard.org/cande-awards/how-to-apply/north-america/

  • Van Hoye, G., & Lievens, F. (2007). Social influences on organizational attractiveness: Investigating if and when word of mouth matters. Journal of Applied Social Psychology, 37, 2024–2047.


  • Van Hoye, G., & Lievens, F. (2009). Tapping the grapevine: A closer look at word-of-mouth as a recruitment source. Journal of Applied Psychology, 94, 341–352.


  • Williams, M. L., & Bauer, T. N. (1994). The effect of a managing diversity policy on organizational attractiveness. Group & Organization Management, 19, 295–308.


  • Wortman, C. B., Brehm, J. W., & Berkowitz, L. (1975). Responses to uncontrollable outcomes. Advanced Experimental Social Psychology, 8, 277–336.


  • Zhu, X., Barnes-Farrell, J., & Dalal, D. K. (2015). Stop apologizing for your samples, start embracing them. Industrial and Organizational Psychology: Perspectives on Science and Practice, 8, 228–232.



Acknowledgments

The authors would like to thank the editor and the anonymous reviewers for their assistance with improving this manuscript.

Author information

Corresponding author

Correspondence to Dev K. Dalal.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Dalal, D. K., Zhu, X., Rangel, B., et al. Improving Applicant Reactions to Forced-Choice Personality Measurement: Interventions to Reduce Threats to Test Takers’ Self-Concepts. J Bus Psychol 36, 55–70 (2021). https://doi.org/10.1007/s10869-019-09655-6
