Faking Fast and Slow: Within-Person Response Time Latencies for Measuring Faking in Personnel Testing

  • Original Paper
  • Published in the Journal of Business and Psychology

Abstract

Purpose

Item response time (RT) latencies offer a promising approach for measuring faking in personnel testing, but they have been studied almost exclusively as long or short RTs relative to group norms. As such, reliably assessing faking RTs at the individual level remains a challenge. To address this issue, the present study examined the usefulness of a within-person difference score index (DSI) method for measuring faking, in which “control question” (baseline) RTs were compared to “target question” RTs within single test administrations.
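
As a rough illustration of the within-person logic (a minimal sketch, not the authors' exact scoring procedure; the item lists, RT values, and cut score below are hypothetical), each respondent's DSI can be thought of as their mean target-item RT minus their mean control-item (baseline) RT from a single administration:

```python
import statistics

def difference_score_index(control_rts, target_rts):
    """Within-person DSI sketch: mean target-item RT minus mean
    control-item (baseline) RT for a single respondent, in seconds."""
    return statistics.mean(target_rts) - statistics.mean(control_rts)

def flag_faking(dsi, cut_score=0.5):
    """Flag a respondent as a suspected faker when the DSI exceeds a
    sample-derived cut score (the value here is illustrative only)."""
    return dsi > cut_score

# One respondent's hypothetical RTs (in seconds) on baseline vs. target items.
control_rts = [1.8, 2.1, 1.9, 2.0]
target_rts = [2.7, 3.0, 2.5, 2.9]
dsi = difference_score_index(control_rts, target_rts)
print(dsi, flag_faking(dsi))  # a positive DSI above the cut score is flagged
```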

Design/Methodology/Approach

Two hundred six participants were randomly assigned to either a simulated faking or an honest testing condition and were administered two types of integrity test items (overt and personality), with group classification (faking/honest) serving as the main dependent variable.

Findings

Faking condition RTs were longer than honest condition RTs for both item types (overt: d = .43; personality: d = .47), and overt item RTs were slightly shorter than personality item RTs in both testing conditions (honest: d = .34; faking: d = .41). Finally, using a sample cut score, the DSI correctly classified an average of 26 % more cases of faking, and produced 53 % fewer false positives, compared with the traditional normative method.

Implications

The results suggest that the DSI can be an advantageous method for identifying faking in personnel testing scenarios.

Originality/Value

This is one of the first studies to propose a practical method for identifying individual-level faking RTs within single test administrations.

Notes

  1. Error rates (i.e., false positives and false negatives) are presented as a function of the total sample, as opposed to a function of either the RT measures’ classifications or the testing conditions. As such, the combined error rates are comparable to the accuracy rates, and the sum of all rates, at any given cut score, equals 100 %.
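
As a toy illustration of that accounting convention (hypothetical counts, not the study's data), every rate is computed over the full sample of 206, so accuracy and the two error rates necessarily sum to 100 %:

```python
# Hypothetical classification counts at a single cut score (not the study's data).
n_total = 206
hits = 70             # fakers correctly flagged
correct_rejects = 80  # honest respondents correctly passed
false_positives = 23  # honest respondents incorrectly flagged
false_negatives = 33  # fakers who went undetected

# Each rate is a percentage of the TOTAL sample, not of one condition or
# of one classification outcome, so the rates always sum to 100 %.
accuracy = 100 * (hits + correct_rejects) / n_total
fp_rate = 100 * false_positives / n_total
fn_rate = 100 * false_negatives / n_total
print(round(accuracy + fp_rate + fn_rate, 1))  # 100.0
```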

Acknowledgments

The authors would like to thank Prof. Shawn Komar for his significant contribution to earlier research on this topic, as well as Prof. Baruch Nevo for his valuable guidance and cooperation.

Author information

Corresponding author

Correspondence to Saul Fine.

About this article


Cite this article

Fine, S., Pirak, M. Faking Fast and Slow: Within-Person Response Time Latencies for Measuring Faking in Personnel Testing. J Bus Psychol 31, 51–64 (2016). https://doi.org/10.1007/s10869-015-9398-5
