
Technical Aspects of Correspondence Studies

Audit Studies: Behind the Scenes with Theory, Method, and Nuance

Part of the book series: Methodos Series (volume 14)

Abstract

This chapter discusses technical concerns and choices that arise when crafting a correspondence or audit study, using external validity as a motivating framework. It covers resume creation, including power analysis, choice of inputs, the pros and cons of matched pairs, solutions to the limited-template problem, and ensuring that instruments indicate what the experimenters want them to indicate. Implementation topics include when and for how long to field a study, deciding on a participant pool, and whether or not to draw from the participant pool with replacement. More technical topics include matching outcomes to inputs, data storage, and analysis issues such as when to use clustering, when not to use fixed effects, and how to measure heterogeneous and interactive effects. The chapter ends with a technical checklist that experimenters can use before fielding a correspondence study.

This chapter has been prepared for the volume Audit Studies: Behind the Scenes with Theory, Method, and Nuance, edited by S. Michael Gaddis. The authors thank all of the researchers who have provided feedback on the Resume Randomizer program, and Joanna Lahey also thanks the many editors who, through referee requests, have forced her to keep up-to-date on the state of correspondence studies. Thanks also to Patrick Button, S. Michael Gaddis, R. Alan Seals, Jill E. Yavorsky, and an anonymous reviewer for helpful feedback.


Notes

  1.

    External validity here is defined as results from the experiment being generalizable to other populations and settings (as in Stock and Watson 2011).

  2.

    Note that, as always, you should check with your IRB about which job sites are allowable based on their Terms of Service (TOS). Some IRBs allow TOS violations that could occur in the normal course of use, whereas others do not.

  3.

    Available at http://www.nber.org/data/ (under “Other”), at https://github.com/beaslera/resumerandomizer, or from the authors by request.

  4.

    The simplest non-trivial template file might be:

      24 gui version number
      *constant* 1 1
      *random* 1-1 2 *matchDifferent*
      *leaf* 1-1-1
      John
      *end_leaf* 1-1-1
      *leaf* 1-1-2
      Jane
      *end_leaf* 1-1-2
      *end_random* 1-1 2
      *end_constant* 1 1

    which defines a single slot into which will go either “John” or “Jane” in the correspondence, and which appears in the user-interface as two text boxes that each contain one of the names plus drop-down boxes for various options. Example template files are distributed with the program, and the HTML user-interface has buttons that can load sixteen examples of templates, e.g., https://raw.githubusercontent.com/beaslera/resumerandomizer/master/example_cover_letter_template.rtf
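For illustration, the behavior of the *matchDifferent* option can be sketched in a few lines of Python. This is a hypothetical stand-in, not the Resume Randomizer's own code; the `match_different` helper and its signature are our invention:

```python
import random

def match_different(leaves, rng=random):
    """Draw two distinct values from a *random* group's leaves, so that
    the two resumes in a matched pair never receive the same value."""
    first, second = rng.sample(leaves, 2)  # sample without replacement
    return first, second

# With the template above, one resume in the pair gets "John" and the
# other gets "Jane", in random order.
pair = match_different(["John", "Jane"])
```

Sampling without replacement is what distinguishes *matchDifferent* from an independent draw per resume, which could assign the same name to both members of a pair.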

  5.

    See Pager and Pedulla (2015) for more information on how perceived discrimination affects job application behavior.

  6.

    Stata’s currently supported sample-size calculator is power, but as of this writing it has limited options compared to G*Power and thus is recommended only for simple designs, although its nratio option is useful for unbalanced designs.

    G*Power is available for free at http://www.gpower.hhu.de/en.html for both Mac and Windows.
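For simple two-group comparisons of callback rates, the normal-approximation sample-size formula that tools like these implement can also be computed directly. The sketch below is our own helper (the name `n_per_group` and the example rates are illustrative), assuming a two-sided two-proportion z-test:

```python
from math import ceil, sqrt
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Applications needed per group to detect a difference between
    callback rates p1 and p2 with a two-sided two-proportion z-test
    (normal approximation, without continuity correction)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for alpha
    z_b = NormalDist().inv_cdf(power)           # critical value for power
    p_bar = (p1 + p2) / 2
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# e.g., detecting a drop in callback rates from 10% to 7% takes roughly
# 1,350-1,360 applications per group at alpha = 0.05 and 80% power.
print(n_per_group(0.10, 0.07))
```

Note how quickly the required sample grows as the effect shrinks; halving the detectable gap roughly quadruples the number of applications needed.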

  7.

    Thanks to R. Alan Seals for this suggestion. He also notes that there is room for a methodology paper on best practices for finding sample sizes in audit studies.

  8.

    Note that researchers using their own domain (e.g., one registered through a hosting service such as HostGator) can quickly create hundreds of email addresses, all with the same passwords and settings, facilitating exact matches when responses come via email. Voicemail matches are more difficult. Neumark et al. (2016) populated voicemail bins such that each bin only had one version of each first name and last name used, which helped with matching. “So if a bin got a call, and they said, ‘Hi Jennifer, we’d like to interview you,’ then we knew the exact applicant since there was only one Jennifer in that bin” (personal communication, Patrick Button, October 20, 2016). R. Alan Seals (personal communication, November 13, 2016) recommends using Google Voice to transcribe phone messages from employers for easy text analysis.
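The bin scheme reduces to a simple lookup: because each bin contains at most one applicant per first name, the pair (bin, first name heard in the message) identifies the applicant exactly. The records and helper below are hypothetical, purely to show the idea:

```python
# Hypothetical applicant records: within a bin, every first name is unique,
# so (bin, first name) is a valid key even though names repeat across bins.
applicants = [
    {"id": "A103", "bin": 1, "first": "Jennifer"},
    {"id": "A217", "bin": 1, "first": "Michael"},
    {"id": "A542", "bin": 2, "first": "Jennifer"},
]

lookup = {(a["bin"], a["first"]): a["id"] for a in applicants}

def match_voicemail(bin_number, first_name):
    """Resolve a voicemail to an applicant id, or None if no match."""
    return lookup.get((bin_number, first_name))

print(match_voicemail(1, "Jennifer"))  # the only Jennifer in bin 1: A103
```

Building the lookup table at assignment time (rather than after fielding) also makes it easy to verify the uniqueness constraint before any applications go out.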

  9.

    To reduce the burden on companies during this step, experimenters commonly respond to firms that contact an applicant for an interview by saying that the applicant has taken another job.

  10.

    R. Alan Seals (personal communication, November 13, 2016) notes that if you save prompts electronically as webpages, it is important that workers all use the same web browser to facilitate text scraping.
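When postings are saved as webpages, scraping their visible text needs only a small parser, and a consistent browser keeps the saved HTML uniform enough for one parser to handle. The sketch below is one illustrative approach using Python's standard-library html.parser (the example HTML is invented):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> content."""
    def __init__(self):
        super().__init__()
        self.skip = 0       # nesting depth inside script/style tags
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

def page_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

print(page_text("<html><body><h1>Sales Associate</h1>"
                "<script>var x=1;</script><p>Apply now</p></body></html>"))
# → Sales Associate Apply now
```

If different browsers saved the pages, the markup they emit can differ enough that a single extractor like this silently drops or duplicates text, which is the practical reason for standardizing on one browser.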

References

  • Agan, A. Y., & Starr, S. B. (2016). Ban the box, criminal records, and statistical discrimination: A field experiment (U of Michigan Law & Econ Research Paper No. 16–012). Available at SSRN: https://ssrn.com/abstract=2795795

  • Baert, S., & Verhaest, D. (2014). Unemployment or overeducation: Which is a worse signal to employers? (IZA Discussion Paper No. 8312). Institute for the Study of Labor (IZA).


  • Barron, J. M., Bishop, J., & Dunkelberg, W. C. (1985). Employer search: The interviewing and hiring of new employees. The Review of Economics and Statistics, 67(1), 43–52.


  • Bendick Jr, M., Brown, L. E., & Wall, K. (1999). No foot in the door: An experimental study of employment discrimination against older workers. Journal of Aging & Social Policy, 10(4), 5–23.


  • Bertrand, M., & Duflo, E. (2016). Field experiments on discrimination (NBER Working Paper No. 22014). http://www.nber.org/papers/w22014. Accessed 13 Oct 2016.

  • Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. The American Economic Review, 94(4), 991–1013.


  • Brewer, M. (2000). Research design and issues of validity. In H. Reis & C. Judd (Eds.), Handbook of research methods in social and personality psychology. Cambridge: Cambridge University Press.


  • Cameron, A. C., & Miller, D. L. (2015). A practitioner’s guide to cluster-robust inference. Journal of Human Resources, 50(2), 317–372.


  • Carbonaro, W., & Schwarz, J. (2018). Opportunities and challenges in designing and conducting a labor market resume study. In S. M. Gaddis (Ed.), Audit studies: Behind the scenes with theory, method, and nuance. Cham: Springer International Publishing.


  • Charness, G., Gneezy, U., & Kuhn, M. A. (2012). Experimental methods: Between-subject and within-subject design. Journal of Economic Behavior & Organization, 81(1), 1–8.


  • Chehras, N. (2017). Automating correspondence study applications with python and SQL: Guide and code. Mimeo.


  • Cohen, J. (1977). Statistical power analysis for the behavioral sciences. New York: Academic.


  • Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155–159.


  • Crabtree, C. (2018). An introduction to conducting email audit studies. In S. M. Gaddis (Ed.), Audit studies: Behind the scenes with theory, method, and nuance. Cham: Springer International Publishing.


  • Darolia, R., Koedel, C., Martorell, P., et al. (2015). Do employers prefer workers who attend for-profit colleges? Evidence from a field experiment. Journal of Policy Analysis and Management, 34(4), 891–903.


  • Deming, D. J., Yuchtman, N., Abulafi, A., et al. (2016). The value of postsecondary credentials in the labor market: An experimental study. The American Economic Review, 106(3), 778–806.


  • Friedman, S., Reynolds, A., & Scovill, S. (2013). An estimate of housing discrimination against same-sex couples. Available via the US Department of Housing and Urban Development: http://big.assets.huffingtonpost.com/hud.pdf

  • Gaddis, S. M. (2015). Discrimination in the credential society: An audit study of race and college selectivity in the labor market. Social Forces, 93(4), 1451–1479.


  • Gaddis, S. M. (2017a). How black are Lakisha and Jamal? Racial perceptions from names used in correspondence audit studies. Sociological Science, 4, 469–489.


  • Gaddis, S. M. (2017b). Racial/ethnic perceptions from Hispanic names: Selecting names to test for discrimination. Socius, 3, 1–11.


  • Gulliford, M. C., Obioha, U. C., & Chinn, S. (1999). Components of variance and intraclass correlations for the design of community-based surveys and intervention studies: Data from the Health Survey for England 1994. American Journal of Epidemiology, 149(9), 876–883.


  • Harrison, G. W., & List, J. A. (2004). Field experiments. Journal of Economic Literature, 42(4), 1009–1055.


  • Heckman, J. J. (1998). Detecting discrimination. The Journal of Economic Perspectives, 12(2), 101–116.


  • Holzer, H. J. (1996). What employers want: Job prospects for less-educated workers. New York: Russell Sage Foundation.


  • Horton, J. J. (forthcoming). The effects of algorithmic labor market recommendations: Evidence from a field experiment. Journal of Labor Economics.


  • Howden, D. (2016). Interviews per hire: Recruiting KPIs. Available via Workable. https://resources.workable.com/blog/interviews-per-hire-recruiting-metrics/. Accessed 13 Oct 2016.

  • Job Openings and Labor Turnover Survey (JOLTS). (2016). Bureau of Labor Statistics. http://www.bls.gov/jlt/home.htm. Accessed 13 Oct 2016.

  • Kang, S. K., DeCelles, K. A., Tilcsik, A., et al. (2016). Whitened résumés: Race and self-presentation in the labor market. Administrative Science Quarterly, 61(3), 469–502.


  • Lahey, J. N. (2008). Age, women, and hiring: An experimental study. Journal of Human Resources, 43(1), 30–56.


  • Lahey, J. N., & Beasley, R. A. (2009). Computerizing audit studies. Journal of Economic Behavior & Organization, 70(3), 508–514.


  • Lahey, J. N., & Oxley, D. (2016). Discrimination at the intersection of age, race, and gender: Evidence from a lab-in-the-field experiment. Working Paper.


  • Lanning, J. A. (2013). Opportunities denied, wages diminished: Using search theory to translate audit-pair study findings into wage differentials. BE Journal of Economic Analysis and Policy, 13(2), 921–958.


  • Maas, C. J., & Hox, J. J. (2005). Sufficient sample sizes for multilevel modeling. Methodology, 1(3), 86–92.


  • Maurer, R. (2016). More employers moving to fewer interviews. Available via Society for Human Resources Management. Accessed 13 Oct 2016.


  • Moynihan, L. M., Roehling, M. V., LePine, M. A., et al. (2003). A longitudinal study of the relationships among job search self-efficacy, job interviews, and employment outcomes. Journal of Business and Psychology, 18(2), 207–233.


  • Neumark, D. (2012). Detecting discrimination in audit and correspondence studies. Journal of Human Resources, 47(4), 1128–1157.


  • Neumark, D. (2016). Experimental research on labor market discrimination (NBER Working Paper No. 22022). http://www.nber.org/papers/w22022. Accessed 17 Oct 2016.

  • Neumark, D., Burn, I., & Button, P. (2016). Evidence from lab and field experiments on discrimination: Experimental age discrimination evidence and the Heckman critique. The American Economic Review, 106(5), 303–308.


  • Olian, J. D., Schwab, D. P., & Haberfeld, Y. (1988). The impact of applicant gender compared to qualifications on hiring recommendations: A meta-analysis of experimental studies. Organizational Behavior and Human Decision Processes, 41(2), 180–195.


  • Olken, B. A. (2015). Promises and perils of pre-analysis plans. The Journal of Economic Perspectives, 29(3), 61–80.


  • Oreopoulos, P. (2011). Why do skilled immigrants struggle in the labor market? A field experiment with thirteen thousand resumes. American Economic Journal: Economic Policy, 3(4), 148–171.


  • Pager, D., & Pedulla, D. S. (2015). Race, self-selection, and the job search process. American Journal of Sociology, 120(4), 1005–1054.


  • Pedulla, D. S. (2018). Emerging frontiers in audit study research: Mechanisms, variation, and representativeness. In S. M. Gaddis (Ed.), Audit studies: Behind the scenes with theory, method, and nuance. Cham: Springer International Publishing.


  • Phillips, C. (2016). Do comparisons of fictional applicants measure discrimination when search externalities are present? Evidence from existing experiments. Working Paper.


  • Porter, N. D., Verdery, A. M., & Gaddis, S. M. (2017). Enhancing big data in the social sciences with crowdsourcing: Data augmentation practices, techniques, and opportunities. Available at SSRN: https://ssrn.com/abstract=2844155

  • Rooth, D. O. (2010). Automatic associations and discrimination in hiring: Real world evidence. Labour Economics, 17(3), 523–534.


  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Belmont: Wadsworth Cengage Learning.


  • Stock, J. H., & Watson, M. W. (2011). Introduction to econometrics (3rd ed.). Boston: Addison-Wesley.


  • Trochim, W. M. K., & Donnelly, J. P. (2006). Research methods knowledge base. Mason: Atomic Dog/Cengage Learning.


  • Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453–458.


  • Vuolo, M., Uggen, C., & Lageson, S. (2018). To match or not to match? Statistical and substantive considerations in audit design and analysis. In S. M. Gaddis (Ed.), Audit studies: Behind the scenes with theory, method, and nuance. Cham: Springer International Publishing.



Author information

Corresponding author: Joanna Lahey.


Copyright information

© 2018 Springer International Publishing AG

About this chapter


Cite this chapter

Lahey, J., Beasley, R. (2018). Technical Aspects of Correspondence Studies. In: Gaddis, S. (eds) Audit Studies: Behind the Scenes with Theory, Method, and Nuance. Methodos Series, vol 14. Springer, Cham. https://doi.org/10.1007/978-3-319-71153-9_4


  • Print ISBN: 978-3-319-71152-2

  • Online ISBN: 978-3-319-71153-9
