Social Indicators Research, Volume 141, Issue 3, pp 931–957

Social Indicators to Explain Response in Longitudinal Studies

  • Annamaria Bianchi
  • Silvia Biffignandi


Economic and social studies use longitudinal panels to estimate change in variables and aggregates of interest. Attrition in such studies may threaten the validity of the panel estimates. This study deepens knowledge of attrition with reference to three waves of the UK Household Longitudinal Study. Whereas participation behaviour in panel surveys has traditionally been studied mainly through socio-demographic variables, without distinguishing the different components of the response process, the focus here is on the role of social indicators and personality traits in explaining contact and cooperation, beyond demographic variables. Findings show that some indicators of community attachment affect the likelihood of making contact with panel members, and that indicators of social participation are significant in explaining cooperation given contact. Personality factors and well-being-related variables turn out not to be significant.


Keywords: Non-response · Attrition · Panel surveys · Big-Five · Social participation · Well-being



This paper was supported by the University of Bergamo 60% grant to Bianchi and Biffignandi. The authors are grateful for comments from Peter Lynn and from the referees. Understanding Society is an initiative funded by the Economic and Social Research Council and various Government Departments, with scientific leadership by the Institute for Social and Economic Research, University of Essex, and survey delivery by NatCen Social Research and Kantar Public. The research data are distributed by the UK Data Service.



Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2018

Authors and Affiliations

  1. Department of Management, Economics and Quantitative Methods, University of Bergamo, Bergamo, Italy
