
An Introduction to Conducting Email Audit Studies

Audit Studies: Behind the Scenes with Theory, Method, and Nuance

Part of the book series: Methodos Series (volume 14)

Abstract

This chapter offers the first general introduction to conducting email audit studies. It provides an overview of the steps involved from experimental design to empirical analysis. It then offers detailed recommendations about email address collection, email delivery, and email analysis, which are usually the three most challenging points of an audit study. The focus here is on providing a set of primarily technical recommendations to researchers who might want to conduct an email audit study. The chapter concludes by suggesting several ways that email audit studies can be adapted to investigate a broader range of social phenomena.

I thank Volha Chykina for her helpful comments. I particularly thank Holger L. Kern for teaching me about audit studies and for providing me with some of the code used to conduct email audit studies.


Notes

  1. See Gaddis (2018) for a history of audit studies and an overview of the approach.

  2. Some recent examples include Gaddis (2015), Gaddis and Ghoshal (2015), Sharman (2010), Radicati and Hoang (2011), Oh and Yinger (2015), Milkman et al. (2012, 2015), Lahey and Beasley (2009), Hogan and Berry (2011), Giulietti et al. (2015), Findley et al. (2015), Bushman and Bonacci (2004), Butler (2014), Ahmed et al. (2012, 2013), Baert (2016), and Baert et al. (2016a, b).

  3. I acknowledge that there are instances in which researchers cannot or should not implement an audit study over email. Perhaps the biggest reason is that it might be impossible to collect email addresses for some populations. For instance, it would be very difficult to obtain email addresses for a random sample of Americans. Similarly, one can imagine international contexts, such as many emerging market economies, where it might be difficult to gather email addresses even for public figures, such as government members. In addition, some interventions are probably less plausible over email than through the regular mail or via phone. To the extent that researchers want to maximize the ecological validity of their interventions, they might want to conduct them via alternative means. Despite these limitations, I still think there are substantial opportunities for conducting additional email audit studies, and these opportunities will continue to grow so long as email remains one of the most widely used means of communication.

  4. While I focus on using R to address some implementation issues, researchers should be able to accomplish similar tasks in Stata or in other programming languages, such as Python.

  5. For ease of exposition, I assume that researchers are implementing a between-subjects design. The general process described in this chapter can easily be adapted to accommodate a within-subjects design. The only potential difficulty would be modifying the email delivery script available in the online appendix; I have addressed this by modifying the code to handle both types of design.
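    To make the between-subjects logic concrete, here is a minimal sketch in Python (note 4 points out that such tasks can be accomplished in languages other than R). The function name, recipient addresses, and condition labels are all hypothetical illustrations, not taken from the chapter's replication code:

    ```python
    import random

    def assign_between_subjects(recipients, conditions, seed=42):
        """Assign each recipient to exactly one condition (between-subjects):
        shuffle the pool, then deal out the conditions in turn so that group
        sizes stay as balanced as possible. The seed makes the assignment
        reproducible for a pre-analysis plan."""
        rng = random.Random(seed)
        shuffled = list(recipients)
        rng.shuffle(shuffled)
        return {r: conditions[i % len(conditions)] for i, r in enumerate(shuffled)}

    # Hypothetical participant pool and condition labels.
    emails = ["a@example.com", "b@example.com", "c@example.com", "d@example.com"]
    assignment = assign_between_subjects(emails, ["treatment", "control"])
    ```

    Each address appears exactly once in the returned mapping; dealing conditions out after shuffling keeps group sizes balanced without a separate blocking step. A within-subjects design would instead pair every recipient with every condition.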

  6. Lin and Green (2015) provide excellent guidance on some of these decisions.

  7. Coffman and Niederle (2015) discuss some of the limitations of pre-analysis plans.

  8. I provide an annotated example of this in the online appendix for this chapter.

  9. Pedulla (2018) discusses some of the other issues that potentially limit the generalizability of audit study findings.

  10. In some cases, researchers might want to randomize the valedictions or salutations. This could be a good idea if scholars are concerned about some actor observing similarities across delivered emails (Butler and Crabtree 2017).
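    One minimal way to implement this randomization, sketched in Python (the chapter's own code is in R, and the phrasings below are hypothetical placeholders rather than tested wordings):

    ```python
    import random

    # Hypothetical phrasings; actual wording should match the study's email scripts.
    SALUTATIONS = ["Dear", "Hello", "Hi"]
    VALEDICTIONS = ["Sincerely", "Best regards", "Thank you"]

    def draw_greeting(rng):
        """Independently sample a salutation and a valediction for one email,
        so that delivered messages do not all share identical framing text."""
        return rng.choice(SALUTATIONS), rng.choice(VALEDICTIONS)

    rng = random.Random(7)  # seeded so the draw is reproducible
    salutation, valediction = draw_greeting(rng)
    ```

    Because the draws are independent of treatment assignment, this variation adds noise to the email frame without confounding the experimental manipulation.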

  11. I have assumed here that all emails can be delivered in a single wave. This might not be possible, depending on the email solution used and the size of the participant pool. One potential problem is that some servers might limit the number of emails sent in any given 24-hour period. If researchers need to send emails across multiple waves, they will need to subset their data into different waves prior to implementation and then execute the script for each wave.
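    The subsetting step can be sketched as follows in Python (the function name and the daily limit are hypothetical; the chapter's delivery script itself is in R):

    ```python
    def split_into_waves(rows, max_per_day):
        """Subset the delivery data into consecutive waves no larger than the
        server's daily sending limit; each wave is then sent on its own day."""
        if max_per_day < 1:
            raise ValueError("max_per_day must be at least 1")
        return [rows[i:i + max_per_day] for i in range(0, len(rows), max_per_day)]

    # 250 recipients with a (hypothetical) 100-email daily cap -> 3 waves.
    waves = split_into_waves(list(range(250)), max_per_day=100)
    ```

    Since randomization occurs before subsetting, each wave contains a mix of conditions, so later waves do not systematically differ in treatment.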

References

  • Ahmed, A. M., Andersson, L., & Hammarstedt, M. (2012). Does age matter for employability? A field experiment on ageism in the Swedish labour market. Applied Economics Letters, 19(4), 403–406.

  • Ahmed, A. M., Andersson, L., & Hammarstedt, M. (2013). Are gay men and lesbians discriminated against in the hiring process? Southern Economic Journal, 79(3), 565–585.

  • Baert, S. (2016). Wage subsidies and hiring chances for the disabled: Some causal evidence. The European Journal of Health Economics, 17(1), 71–86.

  • Baert, S., Norga, J., Thuy, Y., & Van Hecke, M. (2016a). Getting grey hairs in the labour market: An alternative experiment on age discrimination. Journal of Economic Psychology, 57, 86–101.

  • Baert, S., De Visschere, S., Schoors, K., Vandenberghe, D., & Omey, E. (2016b). First depressed, then discriminated against? Social Science & Medicine, 170, 247–254.

  • Bertrand, M., & Mullainathan, S. (2004). Are Emily and Greg more employable than Lakisha and Jamal? A field experiment on labor market discrimination. American Economic Review, 94(4), 991–1013.

  • Broockman, D. E. (2013). Black politicians are more intrinsically motivated to advance Blacks’ interests: A field experiment manipulating political incentives. American Journal of Political Science, 57(3), 521–536.

  • Bushman, B. J., & Bonacci, A. M. (2004). You’ve got mail: Using e-mail to examine the effect of prejudiced attitudes on discrimination against Arabs. Journal of Experimental Social Psychology, 40(6), 753–759.

  • Butler, D. M. (2014). Representing the advantaged: How politicians reinforce inequality. New York: Cambridge University Press.

  • Butler, D. M., & Broockman, D. E. (2011). Do politicians racially discriminate against constituents? A field experiment on state legislators. American Journal of Political Science, 55(3), 463–477.

  • Butler, D. M., & Crabtree, C. (2017). Moving beyond measurement: Adapting audit studies to test bias-reducing interventions. Journal of Experimental Political Science, 4, 57–67.

  • Coffman, L. C., & Niederle, M. (2015). Pre-analysis plans have limited upside, especially where replications are feasible. The Journal of Economic Perspectives, 29(3), 81–97.

  • Costa, M. (2017). How responsive are political elites? A meta-analysis of experiments on public officials. https://doi.org/10.1017/XPS.2017.14

  • Crabtree, C., Golder, M., Gschwend, T., & Indriðason, I. H. (n.d.). Campaign sentiment in European party manifestos (Technical report, working paper).

  • Desposato, S. (2015). Ethics and experiments: Problems and solutions for social scientists and policy professionals. London: Routledge.

  • Driscoll, J. (2015). Prison states & games of chicken. In Ethics and experiments: Problems and solutions for social scientists and policy professionals. New York: Routledge.

  • Findley, M. G., Nielson, D. L., & Sharman, J. C. (2015). Causes of noncompliance with international law: A field experiment on anonymous incorporation. American Journal of Political Science, 59(1), 146–161.

  • Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), 1502–1505.

  • Fujii, L. A. (2012). Research ethics 101: Dilemmas and responsibilities. PS: Political Science & Politics, 45(4), 717–723.

  • Gaddis, S. M. (2015). Discrimination in the credential society: An audit study of race and college selectivity in the labor market. Social Forces, 93(4), 1451–1479.

  • Gaddis, S. M. (2018). An introduction to audit studies in the social sciences. In S. M. Gaddis (Ed.), Audit studies: Behind the scenes with theory, method, and nuance. Cham: Springer International Publishing.

  • Gaddis, S. M., & Ghoshal, R. (2015). Arab American housing discrimination, ethnic competition, and the contact hypothesis. The Annals of the American Academy of Political and Social Science, 660(1), 282–299.

  • Gelman, A., & Hill, J. (2006). Data analysis using regression and multilevel/hierarchical models. New York: Cambridge University Press.

  • Gerber, A. S., & Green, D. P. (2012). Field experiments: Design, analysis, and interpretation. New York: WW Norton.

  • Giulietti, C., Tonin, M., & Vlassopoulos, M. (2015). Racial discrimination in local public services: A field experiment in the US (IZA DP No. 9290, working paper).

  • Grose, C. R. (2014). Field experimental work on political institutions. Annual Review of Political Science, 17, 355–370.

  • Hauck, R. J. P. (2008). Protecting human research participants, IRBs, and political science redux: Editor’s introduction. PS: Political Science & Politics, 41(3), 475–476.

  • Heckman, J. J. (1998). Detecting discrimination. The Journal of Economic Perspectives, 12(2), 101–116.

  • Hogan, B., & Berry, B. (2011). Racial and ethnic biases in rental housing: An audit study of online apartment listings. City & Community, 10(4), 351–372.

  • Imai, K., Ratkovic, M., et al. (2013). Estimating treatment effect heterogeneity in randomized program evaluation. The Annals of Applied Statistics, 7(1), 443–470.

  • Lahey, J. N., & Beasley, R. A. (2009). Computerizing audit studies. Journal of Economic Behavior & Organization, 70(3), 508–514.

  • Lin, W., & Green, D. P. (2015). Standard operating procedures: A safety net for pre-analysis plans. Berkeley. Retrieved from www.stat.berkeley.edu/~winston/sop-safety-net.pdf

  • Lohr, S. (2009). Sampling: Design and analysis. Boston: Nelson Education.

  • Manning, C. D., Surdeanu, M., Bauer, J., Finkel, J. R., Bethard, S., & McClosky, D. (2014). The Stanford CoreNLP natural language processing toolkit. In ACL (System Demonstrations) (pp. 55–60).

  • Miguel, E., et al. (2014). Promoting transparency in social science research. Science, 343(6166), 30–31.

  • Milkman, K. L., Akinola, M., & Chugh, D. (2012). Temporal distance and discrimination: An audit study in academia. Psychological Science, 23(7), 710–717.

  • Milkman, K. L., Akinola, M., & Chugh, D. (2015). What happens before? A field experiment exploring how pay and representation differentially shape bias on the pathway into organizations. Journal of Applied Psychology, 100(6), 1678.

  • Moore, R. T., & Schnakenberg, K. (2012). blockTools: Blocking, assignment, and diagnosing interference in randomized experiments. R package version 0.5-7. http://rtm.wustl.edu/software.blockTools.htm

  • Neumark, D., Bank, R. J., & Van Nort, K. D. (1995). Sex discrimination in restaurant hiring: An audit study (Technical report, National Bureau of Economic Research).

  • Oh, S. J., & Yinger, J. (2015). What have we learned from paired testing in housing markets? Cityscape, 17(3), 15.

  • Olken, B. A. (2015). Promises and perils of pre-analysis plans. The Journal of Economic Perspectives, 29(3), 61–80.

  • Pager, D., & Shepherd, H. (2008). The sociology of discrimination: Racial discrimination in employment, housing, credit, and consumer markets. Annual Review of Sociology, 34, 181.

  • Pedulla, D. S. (2018). Emerging frontiers in audit study research: Mechanisms, variation, and representativeness. In S. M. Gaddis (Ed.), Audit studies: Behind the scenes with theory, method, and nuance. Cham: Springer International Publishing.

  • Pennebaker, J. W. (2015). LIWC: How it works. http://liwc.wpengine.com/how-it-works/

  • Radicati, S., & Hoang, Q. (2011). Email statistics report, 2011–2015. Retrieved 25 May 2011.

  • Riach, P. A., & Rich, J. (2002). Field experiments of discrimination in the market place. The Economic Journal, 112(483), F480–F518.

  • Riach, P. A., & Rich, J. (2004). Deceptive field experiments of discrimination: Are they ethical? Kyklos, 57, 457–470.

  • Sharman, J. C. (2010). Shopping for anonymous shell companies: An audit study of anonymity and crime in the international financial system. The Journal of Economic Perspectives, 24(4), 127–140.

  • Stocksdale, M. (2013). E-mail: Not dead, evolving. http://bit.ly/2AdoOMH

  • Suresh, K. (2011). An overview of randomization techniques: An unbiased assessment of outcome in clinical research. Journal of Human Reproductive Sciences, 4(1), 8.

  • Terechshenko, Z., Crabtree, C., Eck, K., & Fariss, C. J. (n.d.). International norms, sanctioning, and prisoners’ rights: A field experiment with foreign missions (Technical report, working paper).

  • Turner, M. A., Ross, S., Galster, G. C., & Yinger, J. (2002). Discrimination in metropolitan housing markets: National results from phase 1 of the housing discrimination study (HDS) (Technical report).

  • Yanow, D., & Schwartz-Shea, P. (2008). Reforming institutional review board policy: Issues in implementation and field research. PS: Political Science & Politics, 41(3), 483–494.


Author information

Correspondence to Charles Crabtree.


Copyright information

© 2018 Springer International Publishing AG

About this chapter


Cite this chapter

Crabtree, C. (2018). An Introduction to Conducting Email Audit Studies. In: Gaddis, S. M. (Ed.), Audit Studies: Behind the Scenes with Theory, Method, and Nuance. Methodos Series, vol 14. Springer, Cham. https://doi.org/10.1007/978-3-319-71153-9_5


  • DOI: https://doi.org/10.1007/978-3-319-71153-9_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-71152-2

  • Online ISBN: 978-3-319-71153-9

  • eBook Packages: Social Sciences (R0)
