Environmental and Resource Economics, Volume 49, Issue 1, pp 47–64

Interview Effects in an Environmental Valuation Telephone Survey

Abstract

Because markets do not exist for many environmental services, economists have turned to valuation surveys to estimate the value of these services. However, the lack of market experience may make respondents in valuation surveys more prone to interview effects than respondents in other opinion surveys. Without reference to market prices or experience, respondents are less likely to hold well-defined preferences and may therefore be more easily influenced by the interview process and the characteristics of the interviewer. In this paper, we investigate interview effects in a random digit dial telephone survey of recycling valuation and behavior. Following previous research in both psychology and survey methodology, we test the direct effects of interviewer gender and race, as well as the interaction effects between interviewer and respondent characteristics. Using data from 130 interviewers and 1,786 interviewees, we apply a hierarchical regression model that accounts for the clustering of interviews and controls for a variety of other confounding variables. We confirm the existence of both direct and conditional interviewer effects. Respondents state higher willingness to pay when interviewed by white or female interviewers than by non-white or male interviewers, and there are significant interaction effects between interviewer and respondent characteristics. The directions of the interviewer effects are consistent with previous survey research and social psychology theories. We also identify some non-traditional interview process factors that influence survey responses.
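For readers who want a concrete picture of the estimation strategy, the sketch below illustrates a hierarchical (mixed-effects) regression with interviewer-level random intercepts, the general approach the abstract describes for handling the clustering of respondents within interviewers. It is a minimal illustration on simulated data using statsmodels; the variable names (wtp, int_female, int_white, resp_female) and the specification are hypothetical placeholders, not the authors' actual data or model.

```python
# Minimal sketch: mixed-effects regression with interviewer random intercepts.
# All variable names and coefficients are hypothetical, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_interviewers, n_resp = 130, 1786

# Interviewer-level attributes (constant within interviewer).
interviewers = pd.DataFrame({
    "interviewer_id": np.arange(n_interviewers),
    "int_female": rng.integers(0, 2, n_interviewers),
    "int_white": rng.integers(0, 2, n_interviewers),
})

# Respondents nested within interviewers.
df = pd.DataFrame({
    "interviewer_id": rng.integers(0, n_interviewers, n_resp),
    "resp_female": rng.integers(0, 2, n_resp),
}).merge(interviewers, on="interviewer_id")

# Hypothetical stated willingness to pay with an interviewer-level random effect.
interviewer_effect = rng.normal(0, 2, n_interviewers)
df["wtp"] = (
    10
    + 1.5 * df["int_female"]
    + 1.0 * df["int_white"]
    + 0.8 * df["int_female"] * df["resp_female"]  # interviewer-respondent interaction
    + interviewer_effect[df["interviewer_id"]]
    + rng.normal(0, 3, n_resp)
)

# Fixed effects for interviewer and respondent characteristics plus their
# interaction; random intercepts grouped by interviewer account for the
# clustering of interviews within interviewers.
model = smf.mixedlm(
    "wtp ~ int_female * resp_female + int_white",
    data=df,
    groups=df["interviewer_id"],
)
result = model.fit()
print(result.summary())
```

The grouping structure is the key design choice: without the interviewer-level random intercept, standard errors on interviewer characteristics would be understated because responses collected by the same interviewer are not independent.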

Keywords

Interview effects · Willingness to pay · Recycling · Social attribution model

Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  1. Center for Research on Environmental Decisions, Columbia University, New York, USA
  2. Department of Economics and Finance, University of Wyoming, Laramie, USA