
Panel Conditioning in Longitudinal Studies: Evidence From Labor Force Items in the Current Population Survey

Demography

Abstract

Does participating in a longitudinal survey affect respondents’ answers to subsequent questions about their labor force characteristics? In this article, we investigate the magnitude of panel conditioning or time-in-survey biases for key labor force questions in the monthly Current Population Survey (CPS). Using linked CPS records for household heads first interviewed between January 2007 and June 2010, our analyses are based on strategic within-person comparisons across survey months and between-person comparisons across CPS rotation groups. We find considerable evidence for panel conditioning effects in the CPS. Panel conditioning downwardly biases the CPS-based unemployment rate, mainly by leading people to remove themselves from its denominator. Across surveys, CPS respondents (claim to) leave the labor force in greater numbers than otherwise equivalent respondents who are participating in the CPS for the first time. The results cannot be attributed to panel attrition or mode effects. We discuss implications for CPS-based research and policy as well as for survey methodology more broadly.
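
As background for the abstract's point about the denominator, recall the standard definition of the unemployment rate (a textbook identity, not specific to this article; the numbers below are purely illustrative):

```latex
% Unemployment rate: unemployed (U) as a share of the labor force (U + E).
u = \frac{U}{U + E}
% A respondent who switches from "unemployed" to "out of the labor force"
% leaves both the numerator and the denominator, so u falls:
% 10/(10+90) = 10.0\% \longrightarrow 9/(9+90) \approx 9.1\%
```

Because such respondents exit the numerator and the denominator together, conditioned (mis)reporting of labor force exit mechanically lowers the estimated rate.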


Notes

  1. For example, by our count, of the 909 articles published in Demography between 1990 and 2010, at least 75 (8.2 %) featured original analyses of CPS data; another 61 (6.7 %) used CPS data in some other capacity.

  2. For a more comprehensive review of this literature, see Warren and Halpern-Manners (forthcoming).

  3. For instance, Chandon et al. (2004) showed that randomly selected customers of an online grocer who were asked about their intentions to make future purchases were substantially more likely than members of a randomly selected control group to make subsequent purchases from that grocer.

  4. These resemble propositions developed by Cantor (2008), Waterton and Lievesley (1989), and Bailar (1989).

  5. It is important to distinguish panel conditioning from social desirability biases. The latter may lead people to underreport unemployment, but it will lead to consistent levels of underreporting across CPS surveys—and thus would not bias estimates of change over time. Panel conditioning, in contrast, would lead to underreporting of unemployment on follow-up, but not baseline, waves of the CPS—and thus would bias estimates of within-person change over time.
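
To make the distinction concrete, here is a minimal simulation (our illustration, not the authors' analysis; the rates and underreporting probabilities are hypothetical). Constant underreporting shifts the estimated level in every wave but leaves the estimated change unbiased; underreporting that begins only after the first interview biases the change itself:

```python
import random

random.seed(1)
N = 100_000
TRUE_RATE = 0.10  # hypothetical true unemployment rate, identical in both waves

def report(truly_unemployed: bool, underreport_prob: float) -> bool:
    """A truly unemployed respondent admits it with probability 1 - underreport_prob."""
    return truly_unemployed and random.random() >= underreport_prob

truth = [random.random() < TRUE_RATE for _ in range(N)]

# Social desirability: the same underreporting probability in every wave.
sd_w1 = sum(report(t, 0.2) for t in truth) / N
sd_w2 = sum(report(t, 0.2) for t in truth) / N

# Panel conditioning: no underreporting at baseline, underreporting at follow-up.
pc_w1 = sum(report(t, 0.0) for t in truth) / N
pc_w2 = sum(report(t, 0.2) for t in truth) / N

print(f"desirability: change = {sd_w2 - sd_w1:+.4f}")  # ~0: level biased, change unbiased
print(f"conditioning: change = {pc_w2 - pc_w1:+.4f}")  # ~-0.02: change itself biased
```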

  6. This may also explain why time-in-sample biases appear to be more pronounced in the CPS than in the SIPP (McCormick et al. 1992; National Research Council 2009; Pennell and Lepkowski 1992). Whereas CPS respondents are interviewed each month, SIPP respondents are interviewed every four months.

  7. In fact, survey length is not appreciably affected by whether respondents indicate that they are employed, unemployed, or out of the labor force. However, because they do not observe the counterfactual pathway through the CPS interview, respondents may believe that an alternate response would have led to fewer questions. This suspicion may be enough to lead some respondents to provide different (and inaccurate) responses in subsequent months.

  8. In this and subsequent comparisons, observations are weighted by the cross-sectional survey weight in the “origin” month.

  9. In the absence of panel conditioning, we would certainly not expect unemployment rates for people in MIS 1 to be exactly equal to those for people in MIS 2 in any particular calendar month; sometimes they will be higher, and sometimes they will be lower. However, in the absence of panel conditioning, we would expect the difference to be positive as often as it is negative. As shown in Fig. 1, the unemployment rate is higher for those in MIS 2 in 29 of 41 (71 %) possible comparisons. If we take this to be a binomial process (where 1 means that MIS1 – MIS2 is negative, and 0 means that MIS1 – MIS2 is positive), and if we take Pr(MIS1 – MIS2 < 0) to be .5, then the probability of getting 29 or more negative values in 41 trials is about .005 (see the calculation sketched below). This suggests that our findings are not the result of sampling error.
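
The tail probability in this note can be reproduced in a few lines of Python, using only the numbers given above:

```python
from math import comb

n, k = 41, 29  # 29 of the 41 monthly comparisons have MIS1 - MIS2 < 0

# P(X >= 29) for X ~ Binomial(41, 0.5): how often we would see at least this
# many negative differences if negative and positive were equally likely.
p = sum(comb(n, j) for j in range(k, n + 1)) / 2**n
print(f"P(X >= {k}) = {p:.4f}")  # prints 0.0058, i.e., "about .005" as the note says
```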

  10. To account for panel attrition effects in this supplementary analysis, we used a form of poststratification weighting to make the two groups of household heads identical with respect to their distributions of age, race/ethnicity, sex, marital status, region of residence, urbanicity, and nativity status; the procedure is detailed in Warren and Halpern-Manners (forthcoming). A stylized version of this reweighting is sketched below.
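
For readers unfamiliar with the technique, here is a minimal sketch of cell-based poststratification (our illustration under assumed column names, not the authors' exact procedure): reweight the comparison group so its joint covariate distribution matches the reference group's.

```python
import pandas as pd

# Hypothetical covariate columns mirroring the note's list.
STRATA = ["age_group", "race_ethnicity", "sex", "marital_status",
          "region", "urbanicity", "nativity"]

def poststratify(reference: pd.DataFrame, comparison: pd.DataFrame) -> pd.Series:
    """Weights aligning `comparison`'s covariate distribution with `reference`'s."""
    ref_shares = reference.groupby(STRATA).size() / len(reference)
    cmp_shares = comparison.groupby(STRATA).size() / len(comparison)
    # Each cell's weight is the ratio of target share to observed share;
    # cells absent from `reference` come back NaN and would need collapsing.
    weights = (ref_shares / cmp_shares).rename("w")
    return comparison.join(weights, on=STRATA)["w"]
```

Weighted estimates for the comparison group would then use these weights in place of, or multiplied into, the base survey weights.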

  11. To conserve space, we do not present analogous graphs of the percentage out of the labor force because of disability or retirement for respondents interviewed by phone. Results for these variables, which are entirely consistent with the results shown in Figs. 3 and 4, are available upon request.

  12. It is important to note that panel conditioning in the CPS leads to bias in estimated levels of unemployment, but does not lead to systematic bias in short- or long-term trends in that rate.

References

  • Anderson, B. A., Silver, B. D., & Abramson, P. R. (1988). The effects of race of the interviewer on measures of electoral participation by Blacks in SRC National Election Studies. Public Opinion Quarterly, 52, 53–83.

  • Bailar, B. A. (1975). Effects of rotation group bias on estimates from panel surveys. Journal of the American Statistical Association, 70(349), 23–30.

  • Bailar, B. A. (1989). Information needs, surveys, and measurement errors. In D. Kasprzyk, G. J. Duncan, G. Kalton, & M. P. Singh (Eds.), Panel surveys (pp. 1–24). New York: Wiley.

  • Bartels, L. M. (1999). Panel effects in the American National Election Studies. Political Analysis, 8, 1–20.

  • Battaglia, M. P., Zell, E., & Ching, P. (1996). Can participating in a panel sample introduce bias into trend estimates? (National Immunization Survey Working Paper). Washington, DC: National Center for Health Statistics. Retrieved from http://www.cdc.gov/nis/pdfs/estimation_weighting/battaglia1996b.pdf

  • Borle, S., Dholakia, U. M., Singh, S. S., & Westbrook, R. A. (2007). The impact of survey participation on subsequent customer behavior: An empirical investigation. Marketing Science, 26, 711–726.

  • Bridge, R. G., Reeder, L. G., Kanouse, D., Kinder, D. R., Nagy, V. T., & Judd, C. M. (1977). Interviewing changes attitudes—Sometimes. Public Opinion Quarterly, 41, 56–64.

  • Campbell, D. T., & Stanley, J. C. (1966). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.

  • Cantor, D. (2008). A review and summary of studies on panel conditioning. In S. Menard (Ed.), Handbook of longitudinal research: Design, measurement, and analysis (pp. 123–138). Burlington, MA: Academic Press.

  • Chandon, P., Morwitz, V. G., & Reinartz, W. J. (2004). The short- and long-term effects of measuring intent to repurchase. Journal of Consumer Research, 31, 566–572.

  • Chandon, P., Morwitz, V. G., & Reinartz, W. J. (2005). Do intentions really predict behavior? Self-generated validity effects in survey research. Journal of Marketing, 69(2), 1–14.

  • Clausen, A. R. (1968). Response validity: Vote report. Public Opinion Quarterly, 32, 588–606.

  • Das, M., Toepoel, V., & van Soest, A. (2007). Can I use a panel? Panel conditioning and attrition bias in panel surveys (CentER Discussion Paper Series No. 2007-56). Tilburg, The Netherlands: Tilburg University CentER.

  • Das, M., Toepoel, V., & van Soest, A. (2011). Nonparametric tests of panel conditioning and attrition bias in panel surveys. Sociological Methods & Research, 40, 32–56.

  • De Amici, D., Klersy, C., Ramajoli, F., Brustia, L., & Politi, P. (2000). Impact of the Hawthorne effect in a longitudinal clinical study: The case of anesthesia. Controlled Clinical Trials, 21, 103–114.

  • Dholakia, U. M., & Morwitz, V. G. (2002). The scope and persistence of mere-measurement effects: Evidence from a field study of customer satisfaction measurement. Journal of Consumer Research, 29, 159–167.

  • Duan, N., Alegria, M., Canino, G., McGuire, T. G., & Takeuchi, D. (2007). Survey conditioning in self-reported mental health service use: Randomized comparison of alternative instrument formats. Health Services Research, 42, 890–907.

  • Fazio, R. H. (1989). On the power and functionality of attitudes: The role of attitude accessibility. In A. R. Pratkanis, S. J. Breckler, & A. G. Greenwald (Eds.), Attitude structure and function (pp. 153–179). Hillsdale, NJ: Lawrence Erlbaum Associates.

  • Fazio, R. H., Sanbonmatsu, D. M., Powell, M. C., & Kardes, E. R. (1986). On the automatic activation of attitudes. Journal of Personality and Social Psychology, 50, 229–238.

  • Feldman, J. M., & Lynch, J. G. (1988). Self-generated validity and other effects of measurement on belief, attitude, intention, and behavior. Journal of Applied Psychology, 73, 421–435.

  • Fitzsimons, G. J., & Moore, S. G. (2008). Should we ask our children about sex, drugs and rock & roll? Potentially harmful effects of asking questions about risky behaviors. Journal of Consumer Psychology, 18, 82–95.

  • Fitzsimons, G. J., Nunes, J. C., & Williams, P. (2007). License to sin: The liberating role of reporting expectations. Journal of Consumer Research, 34(1), 22–31.

  • Fitzsimons, G. J., & Williams, P. (2000). Asking questions can change choice behavior: Does it do so automatically or effortfully? Journal of Experimental Psychology-Applied, 6, 195–206.

  • Godin, G., Sheeran, P., Conner, M., & Germain, M. (2008). Asking questions changes behavior: Mere measurement effects on frequency of blood donation. Health Psychology, 27, 179–184.

  • Granberg, D., & Holmberg, S. (1992). The Hawthorne effect in election studies—The impact of survey participation on voting. British Journal of Political Science, 22, 240–247.

  • Greenwald, A. G., Carnot, C. G., Beach, R., & Young, B. (1987). Increasing voting-behavior by asking people if they expect to vote. Journal of Applied Psychology, 72, 315–318.

  • Hansen, M. H., Hurwitz, W. N., Nisselson, H., & Steinberg, J. (1955). The redesign of the Census Current Population Survey. Journal of the American Statistical Association, 50, 701–719.

  • Hernandez, L. M., Durch, J. S., Blazer, D. G., & Hoverman, I. V. (1999). Gulf War veterans: Measuring health. Committee on Measuring the Health of Gulf War Veterans, Division of Health Promotion and Disease Prevention Institute of Medicine. Washington, DC: National Academies Press.

  • Holt, D. (1989). Panel conditioning: Discussion. In D. Kasprzyk, G. J. Duncan, G. Kalton, & M. P. Singh (Eds.), Panel surveys (pp. 340–347). New York: Wiley.

  • Janiszewski, C., & Chandon, E. (2007). Transfer-appropriate processing, response fluency, and the mere measurement effect. Journal of Marketing Research, 44, 309–323.

  • Jensen, P. S., Watanabe, H. K., & Richters, J. E. (1999). Who’s up first? Testing for order effects in structured interviews using a counterbalanced experimental design. Journal of Abnormal Child Psychology, 27, 439–445.

  • Kalton, G., & Citro, C. F. (2000). Panel surveys: Adding the fourth dimension. In D. Rose (Ed.), Researching social and economic change (pp. 36–53). London, UK: Routledge.

  • Kessler, R. C., Wittchen, H.-U., Abelson, J. A., McGonagle, K., Schwarz, N., Kendler, K. S., . . . Zhao, S. (1998). Methodological studies of the Composite International Diagnostic Interview (CIDI) in the US National Comorbidity Survey (NCS). International Journal of Methods in Psychiatric Research, 7, 33–55.

  • Kraut, R. E., & McConahay, J. B. (1973). How being interviewed affects voting: An experiment. Public Opinion Quarterly, 37, 398–406.

  • Landsberger, H. A. (1958). Hawthorne revisited. Ithaca, NY: Cornell University.

  • Levav, J., & Fitzsimons, G. J. (2006). When questions change behavior—The role of ease of representation. Psychological Science, 17, 207–213.

  • Lucas, C. P., Fisher, P., Piacentini, J., Zhang, H., Jensen, P. S., Shaffer, D., . . . Canino, G. (1999). Features of interview questions associated with attenuation of symptom reports. Journal of Abnormal Child Psychology, 27, 429–437.

  • Mathiowetz, N. A., & Lair, T. J. (1994). Getting better? Change or error in the measurement of functional limitations. Journal of Economic and Social Measurement, 20, 237–262.

  • McCormick, M. K., Butler, D. M., & Singh, R. P. (1992). Investigating time in sample effect for the Survey of Income and Program Participation. Presented at Proceedings of the Survey Research Methods Section of the American Statistical Association.

  • Meurs, H., Wissen, L. V., & Visser, J. (1989). Measurement biases in panel data. Transportation, 16, 175–194.

  • Millar, M. G., & Tesser, A. (1986). Thought-induced attitude-change: The effects of schema structure and commitment. Journal of Personality and Social Psychology, 51, 259–269.

  • Morwitz, V. G. (2005). The effect of survey measurement on respondent behaviour. Applied Stochastic Models in Business and Industry, 21, 451–455.

  • Nancarrow, C., & Cartwright, T. (2007). Online access panels and tracking research: The conditioning issue. International Journal of Market Research, 49, 573–594.

  • National Research Council. (2009). In C. F. Citro & J. K. Scholz (Eds.), Reengineering the Survey of Income and Program Participation. Panel on the Census Bureau’s Reengineered Survey of Income and Program Participation. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academies Press.

  • Olson, J. A. (1999). Linkages with data from Social Security administrative records in the Health and Retirement Study. Social Security Bulletin, 62, 73–85.

  • O’Sullivan, I., Orbell, S., Rakow, T., & Parker, R. (2004). Prospective research in health service settings: Health psychology, science and the “Hawthorne” effect. Journal of Health Psychology, 9, 355–359.

  • Pennell, S. G., & Lepkowski, J. N. (1992). Panel conditioning effects in the Survey of Income and Program Participation. Presented at Proceedings of the Survey Research Methods Section of the American Statistical Association.

  • Shack-Marquez, J. (1986). Effects of repeated interviewing on estimation of labor-force status. Journal of Economic and Social Measurement, 14, 379–398.

  • Sherman, S. J. (1980). On the self-erasing nature of errors of prediction. Journal of Personality and Social Psychology, 39, 211–221.

  • Shockey, J. W. (1988). Adjusting for response error in panel surveys: A latent class approach. Sociological Methods & Research, 17, 65–92.

  • Simmons, C. J., Bickart, B. A., & Lynch, J. G. (1993). Capturing and creating public opinion in survey research. Journal of Consumer Research, 20, 316–329.

  • Solomon, R. L. (1949). An extension of control group design. Psychological Bulletin, 46, 137–150.

  • Solon, G. (1986). Effects of rotation group bias on estimation of unemployment. Journal of Business & Economic Statistics, 4(1), 105–109.

  • Spangenberg, E. R., Greenwald, A. G., & Sprott, D. E. (2008). Will you read this article’s abstract? Theories of the question-behavior effect. Journal of Consumer Psychology, 18, 102–106.

  • Spangenberg, E. R., Sprott, D. E., Grohmann, B., & Smith, R. J. (2003). Mass-communicated prediction requests: Practical application and a cognitive dissonance explanation for self-prophecy. Journal of Marketing, 67(3), 47–62.

  • Sturgis, P., Allum, N., & Brunton-Smith, I. (2009). Attitudes over time: The psychology of panel conditioning. In P. Lynn (Ed.), Methodology of longitudinal surveys (pp. 113–126). New York: Wiley.

  • Tesser, A. (1978). Self-generated attitude change. In L. Berkowitz (Ed.), Advances in experimental social psychology (pp. 289–338). New York: Academic Press.

  • Toh, R. S., Lee, E., & Hu, M. Y. (2006). Social desirability bias in diary panels is evident in panelists’ behavioral frequency. Psychological Reports, 99, 322–334.

  • Torche, F., Warren, J. R., Halpern-Manners, A., & Valenzuela, E. (2012). Panel conditioning in a longitudinal study of Chilean adolescents' substance use: Evidence from an experiment. Social Forces, 90, 891–918.

  • Traugott, M. W., & Katosh, J. P. (1979). Response validity in surveys of voting-behavior. Public Opinion Quarterly, 43, 359–377.

  • U.S. Bureau of Labor Statistics. (2006). Design and methodology: Current Population Survey (Technical Paper 66). Washington, DC: U.S. Department of Labor, Census Bureau.

  • Voogt, R. J. J., & Van Kempen, H. (2002). Nonresponse bias and stimulus effects in the Dutch National Election Study. Quality & Quantity, 36, 325–345.

  • Wang, K., Cantor, D., & Safir, A. (2000). Panel conditioning in a random digit dial survey. Presented at Proceedings of the Survey Research Methods Section of the American Statistical Association.

  • Warren, J. R., & Halpern-Manners, A. (Forthcoming). Panel conditioning in longitudinal social science surveys. Sociological Methods & Research.

  • Waterton, J., & Lievesley, D. (1989). Evidence of conditioning effects in the British Social Attitudes Panel Survey. In D. Kasprzyk, G. J. Duncan, G. Kalton, & M. P. Singh (Eds.), Panel surveys (pp. 319–339). New York: Wiley.

  • Williams, P., Block, L. G., & Fitzsimons, G. J. (2006). Simply asking questions about health behaviors increases both healthy and unhealthy behaviors. Social Influence, 1, 117–127.

  • Williams, P., Fitzsimons, G. J., & Block, L. G. (2004). When consumers do not recognize “benign” intention questions as persuasion attempts. Journal of Consumer Research, 31, 540–550.

  • Williams, W. H., & Mallows, C. L. (1970). Systematic biases in panel surveys due to differential nonresponse. Journal of the American Statistical Association, 65, 1338–1349.

  • Yalch, R. F. (1976). Pre-election interview effects on voter turnout. Public Opinion Quarterly, 40, 331–336.

Acknowledgments

Order of authorship is alphabetical to reflect equal contributions by the authors. This article was inspired by a conversation with Michael Hout, and was originally prepared for presentation at the April 2010 annual meetings of the Population Association of America. The National Science Foundation (SES-0647710) and the University of Minnesota’s Life Course Center, Department of Sociology, College of Liberal Arts, and Minnesota Population Center have provided support for this project. We warmly thank Eric Grodsky, Ross Macmillan, Gregory Weyland, anonymous reviewers, and workshop participants at the University of Minnesota, the University of Texas, the University of Wisconsin-Madison, and New York University for their constructive criticism and comments. Finally, we thank Anne Polivka, Dorinda Allard, and Steve Miller at the U.S. Bureau of Labor Statistics for their helpful feedback. However, all errors or omissions are the authors’ responsibility.

Author information

Correspondence to Andrew Halpern-Manners.


Cite this article

Halpern-Manners, A., Warren, J.R. Panel Conditioning in Longitudinal Studies: Evidence From Labor Force Items in the Current Population Survey. Demography 49, 1499–1519 (2012). https://doi.org/10.1007/s13524-012-0124-x
