Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames

Abstract

The increasing use of internet surveys for stated-preference studies raises questions about the effect of survey mode on welfare estimates. A number of studies have conducted convergent-validity investigations comparing internet implementation with other survey modes such as mail, telephone and in-person interviews. All but one of these comparisons, however, confound survey mode with differences in the sample frames used for the different modes. In this study we investigate differences between internet and mail survey modes while holding the sample frame constant, in the context of a choice-modelling study of improvements in the River Murray in Australia. We also investigate the effect of sample frame while holding the survey mode (mail) constant. We find that survey mode (internet vs. mail) influences welfare estimates: the internet welfare estimates are, on average, about 78 % of the mail welfare estimates. Sample frame (internet panel vs. postal addresses), by contrast, has no significant effect on welfare estimates when the survey is implemented by mail.

Notes

  1. Bell et al. (2011) also estimated willingness to pay across modes when sociodemographics were held constant. With this correction they found that the Internet Panel ($31) produced a lower value than the Mall Intercept ($43) and Phone–Mail ($54) modes, and a value similar to sampling at a Central Location ($29).

  2. Attributes and attribute levels were selected through consultation with hydrologists and ecologists. Their knowledge of the river was based on ecosystem modelling and on information contained in publications documenting river conditions (Department of Water, Land and Biodiversity Conservation 2006; Murray–Darling Basin Ministerial Council 2003; Overton and Doody 2008; Paton 2000; Paton et al. 2009).

  3. The MM sample stratification was as follows: the Murray–Darling Basin catchment area (n = 1,000), New South Wales (n = 1,400), South Australia (n = 1,000), Victoria (n = 1,200), and the rest of Australia, comprising Queensland, Western Australia, Tasmania and the Northern Territory (n = 1,400).

  4. The error component model approximates the nested logit model in accounting for scale differences. Train (2009) states "(a)llowing different variances for the random quantities is analogous to allowing the inclusive value coefficient to differ across nests in the nested logit model" (p. 140). Hensher and Rose (2008) demonstrate this in practice by using the error components model to combine stated-preference and revealed-preference data, which is analogous to combining the different survey-mode data sets in the current paper. The error component model thus offers the same advantage as the nested logit model while also accounting for the panel nature of the response data (an illustrative simulation of this specification appears after these notes).

  5. We were not able to investigate sample selection for two reasons. First, information on nonrespondents was not available for all samples (e.g., the II treatment). Second, there is currently no sample-selection estimator, à la Heckman, for categorical response data with more than two response categories (the bivariate probit model covers only the binary case), so indirect and imperfect approaches must be used to investigate sample selection in choice models (Yuan et al. 2015). This is a topic that deserves consideration in future mode comparison studies.

  6. Raking, or sample balancing, is an iterative method for deriving sampling weights that make a survey sample match the population on multiple characteristics simultaneously. While a simple reweighting of the survey data can correct for an imbalance in a single characteristic, such as gender, correcting for imbalances across multiple characteristics generally requires a more complicated set of sample weights (Battaglia et al. 2009), as sketched below.
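
A minimal sketch of the raking (iterative proportional fitting) procedure described in Note 6, assuming Python with NumPy and pandas; the sample data, variable names and target population shares are hypothetical, and this is not the weighting code used in the study.

    import numpy as np
    import pandas as pd

    def rake_weights(data, margins, max_iter=100, tol=1e-6):
        """Iterative proportional fitting: adjust unit weights until the weighted
        sample shares match the target population shares for every characteristic."""
        w = np.ones(len(data))
        for _ in range(max_iter):
            max_adjust = 0.0
            for var, targets in margins.items():
                total = w.sum()
                # One multiplicative factor per category, applied simultaneously,
                # so this characteristic's margins are matched exactly.
                factors = {}
                for level, target_share in targets.items():
                    mask = (data[var] == level).to_numpy()
                    current_share = w[mask].sum() / total
                    factors[level] = target_share / current_share if current_share > 0 else 1.0
                for level, factor in factors.items():
                    w[(data[var] == level).to_numpy()] *= factor
                    max_adjust = max(max_adjust, abs(factor - 1.0))
            if max_adjust < tol:            # all margins (approximately) matched
                break
        return w * len(data) / w.sum()      # normalise so the weights average to one

    # Hypothetical example: balance a small sample on gender and age group at once.
    sample = pd.DataFrame({
        "gender": ["F", "F", "M", "M", "M", "F"],
        "age":    ["<40", "40+", "<40", "40+", "40+", "<40"],
    })
    population_margins = {
        "gender": {"F": 0.51, "M": 0.49},
        "age":    {"<40": 0.45, "40+": 0.55},
    }
    sample["weight"] = rake_weights(sample, population_margins)

Each pass matches the weighted margins of one characteristic exactly, and iterating over the characteristics until the adjustment factors stop changing yields weights that approximately satisfy all of the margins at once.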

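The following simulation sketch illustrates the error component specification discussed in Note 4. It is written in Python/NumPy with assumed coefficient values and a hypothetical design, and it is not the model estimated in the paper: a zero-mean normal error component shared by the non-status-quo alternatives, with a standard deviation that differs across data sets, absorbs extra error variance, which is the role the inclusive value coefficient plays across nests in a nested logit.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_choices(beta, X, sigma_ec):
        """Simulate choices from a logit kernel plus a zero-mean normal error
        component shared by the non-status-quo alternatives of each respondent.
        A larger sigma_ec means more unexplained variance (a smaller scale)."""
        n_obs, n_alt, _ = X.shape
        utility = X @ beta                              # deterministic utility, shape (n_obs, n_alt)
        eta = rng.normal(0.0, sigma_ec, size=n_obs)     # one error-component draw per respondent
        utility[:, 1:] += eta[:, None]                  # shared by the non-status-quo alternatives
        utility += rng.gumbel(size=(n_obs, n_alt))      # iid Gumbel kernel error
        return np.argmax(utility, axis=1)               # index of the chosen alternative

    # Hypothetical pooled data: identical preference parameters, but one survey
    # mode is noisier; the larger error-component standard deviation absorbs this.
    beta = np.array([1.0, -0.5])                        # assumed attribute coefficients
    X = rng.normal(size=(2000, 3, 2))                   # 2000 respondents, 3 alternatives, 2 attributes
    choices_mode_a = simulate_choices(beta, X, sigma_ec=1.5)
    choices_mode_b = simulate_choices(beta, X, sigma_ec=0.5)

In estimation on pooled data, the attribute coefficients would be shared across data sets while the error-component standard deviation is allowed to differ by survey mode, which is how scale differences are accommodated.
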
References

  • American Association for Public Opinion Research (AAPOR) (2006) Standard definitions: final dispositions of case codes and outcome rates for surveys. Retrieved June 18, 2007, from www.aapor.org/uploads/standarddefs_4.pdf

  • Banzhaf S, Burtraw D, Evans D, Krupnick A (2006) Valuation of natural resource improvements in the Adirondacks. Land Econ 82(3):445–464

  • Battaglia Michael P, Hoaglin David C, Frankel Martin R (2009) Practical considerations in raking survey data. Surv Pract 2(5):1–10

  • Bell J, Huber J, Viscusi WK (2011) Survey mode effects on valuation of environmental goods. Int J Environ Res Public Health 8:1222–1243

  • Berrens RP, Bohara AK, Jenkins-Smith H, Silva C (2003) The advent of internet surveys for political research: a comparison of telephone and internet samples. Polit Anal 11(1):1–22

  • Bliemer M, Rose J, Hensher D (2009) Efficient stated choice experiments for estimating nested logit models. Transp Res Part B 43(1):19–35

  • Bliemer M, Rose J (2006) Designing stated choice experiments: the state of the art. In: 11th international conference on travel behaviour research, Kyoto, Japan, 20th August 2006

  • Carmines EG, Zeller RA (1979) Reliability and validity assessment. Sage Publications, Thousand Oaks

  • Canavari M, Nocella G, Scarpa R (2005) Stated willingness-to-pay for organic fruit and pesticide ban. J Food Prod Mark 11(3):107–134

  • Covey J, Robinson A, Jones-Lee M, Loomes G (2010) Responsibility scale and the valuation of rail safety. J Risk Uncertain 40(1):85–108

  • CSIRO (2008) Water availability in the Murray–Darling basin. In: Report to the Australian Government from the CSIRO Murray–Darling Basin Sustainable Yields Project

  • Department of Water, Land and Biodiversity Conservation (2006) Lower Lakes, Coorong and Murray Mouth Asset Environmental Management Plan. Government of South Australia, Adelaide

  • Grandjean BD, Nelson NM, Taylor PA (2009) Comparing an internet panel survey to mail and phone surveys on willingness to pay for environmental quality: a national mode test. In: 64th annual conference of the American Association for Public Opinion Research

  • Hatton-MacDonald DH, Morrison MD, Rose JM, Boyle KJ (2011) Valuing a multistate river: the case of the River Murray. Aust J Agric Resour Econ 55(3):374–392

  • Hensher DA, Rose JM (2008) Combining RP and SP data: biases in using the nested logit ‘trick’—contrasts with flexible mixed logit incorporating panel and scale effects. J Transp Geogr 16(2):126–133

  • Hillman T (2008) Ecological requirements: creating a working river in the Murray–Darling basin. In: Crase L (ed) Water policy in Australia: the impact of change and uncertainty. Resources for the Future, Washington, DC

  • Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8): e124. doi:10.1371/journal.pmed.0020124

  • Li H, Berrens R, Bohara A, Jenkins-Smith H, Silva C, Weimer D (2004) Telephone versus Internet samples for a national advisory referendum: are the underlying stated preferences the same? Appl Econ Lett 11(3):173–176

  • Li H, Jenkins-Smith H, Silva C, Berrens R, Herron KG (2009) Public support for reducing U.S. reliance on fossil fuels: integrating household willingness-to-pay for energy resources and development. Ecol Econ 68(3):731–742

  • Lindhjem H, Navrud S (2011a) Are internet surveys an alternative to face-to-face interviews in contingent valuation? Ecol Econ 70(9):1628–1637

  • Lindhjem H, Navrud S (2011b) Using internet in stated preference surveys: a review and comparison of survey modes. Int Rev Environ Resour Econ 5(4):309–351

  • Malhotra N, Krosnick JA (2007) The effect of survey mode and sampling on inferences about political attitudes and behavior: comparing the 2000 and 2004 ANES to internet surveys with nonprobability samples. Polit Anal 15(3):286–323

  • Marta-Pedroso C, Freitas H, Domingos T (2007) Testing for the survey mode effect on contingent valuation data quality: a case study of web based versus in-person interviews. Ecol Econ 62(3–4):388–398

  • Murray–Darling Basin Ministerial Council (2003) Native fish strategy for the Murray–Darling basin 2003–2013. MDBC Publication No. 25/04, Murray–Darling Basin Commission

  • Nielsen J (2011) Use of the internet for willingness-to-pay surveys: a comparison of face-to-face and web-based interviews. Resour Energy Econ 33(1):119–129

  • Olsen S (2009) Choosing between internet and mail survey modes for choice experiment surveys considering non-market goods. Environ Resour Econ 44(4):591–610

  • Overton I, Doody T (2008) Ecosystem changes on the River Murray floodplain over the last 100 years and predictions of climate change. In: Taniguchi M, Burnett W, Fukishima Y, Haigh M, Umezawa Y (eds) From headwaters to the ocean-hydrological changes and watershed management. Taylor and Frances Group, London

  • Paton D (2000) Bird ecology in the Coorong and lakes region. In: Jensen A, Good M, Tucker P, Long M (eds) River murray barrages environmental flows. Murray–Darling Basin Commission, Canberra

  • Paton D, Rogers B, Hill M, Bailey C, Ziembicki M (2009) Temporal changes to spatially stratified waterbird communities of the Coorong, South Australia: implications for the management of heterogeneous wetlands. Anim Conserv 12:408–417

  • Poe G, Giraud K, Loomis J (2005) Computational methods for measuring the difference of empirical distributions. Am J Agric Econ 87(2):353–365

  • Rose J, Bliemer M, Hensher D, Collins AT (2008) Constructing efficient stated choice experiments allowing for differences in error variances across subsets of alternatives. Transp Res Part B 42(4):395–406

  • Shih T-H, Fan X (2009) Comparing response rates in e-mail and paper surveys: a meta-analysis. Educ Res Rev 4(1):26–40

  • Small GW, Moody TD, Siddarth P, Bookheimer SY (2009) Your brain on Google: patterns of cerebral activation during internet searching. Am J Geriatr Psychiatry 17(2):116–126

  • Tourangeau R, Rips LJ, Rasinski KA (2000) The psychology of survey response. Cambridge University Press, Cambridge

  • Train K (2009) Discrete choice methods with simulation. Cambridge University Press, Cambridge

  • Train K (2010) Discrete choice methods with simulation, 2nd edn. Cambridge University Press, Cambridge

  • Vossler CA, Evans MF (2009) Bridging the gap between the field and the lab: environmental goods, policy maker input, and consequentiality. J Environ Econ Manage 58(3):338–345

  • van der Heide CM, van den Bergh JCJM, van Ierland EC, Nunes PALD (2008) Economic valuation of habitat defragmentation: a study of the Veluwe, the Netherlands. Ecol Econ 67(2):205–216

  • Vossler CA, Doyon M, Rondeau D (2012) Truth in consequentiality: theory and field evidence on discrete choice experiments. Am Econ J Microecon 4(4):145–171

  • Windle J, Rolfe J (2011) Comparing responses from internet and paper-based collection methods in more complex stated preference environmental valuation surveys. Econ Anal Policy 41(1):83–97

  • Winter N (2008) SURVWGT: Stata module to create and manipulate survey weights http://econpapers.repec.org/software/bocbocode/s427503.htm

  • Yeager DS, Krosnick JA, Chang LC, Javitz HS, Levendusky MS, Simpser A, Wang R (2011) Comparing the accuracy of RDD telephone surveys and internet surveys conducted with probability and non-probability samples. Unpublished paper, Stanford University

  • Yuan Y, Boyle KJ, You W (2015) Sample selection, individual heterogeneity and regional heterogeneity in valuing farmland conservation easements. Land Econ (forthcoming)

Author information

Corresponding author

Correspondence to Kevin J. Boyle.

About this article

Cite this article

Boyle, K.J., Morrison, M., MacDonald, D.H. et al. Investigating Internet and Mail Implementation of Stated-Preference Surveys While Controlling for Differences in Sample Frames. Environ Resource Econ 64, 401–419 (2016). https://doi.org/10.1007/s10640-015-9876-2
