An Exploration into How Physical Activity Data-Recording Devices Could be Used in Computer-Supported Data Investigations

  • Victor R. Lee
  • Maneksha DuMont


Portable physical activity monitoring devices hold great potential as data collection tools for educational purposes. Using one such device, we designed and implemented a weeklong workshop with high school students to test the utility of this technology. During that intervention, students performed data investigations of physical activity that culminated in the design and implementation of their own studies. In this paper, we explore some of the mathematical thinking that took place through a series of vignettes of a pair of students analyzing some of their own activity data. A personal connection to the data appeared to help these students recognize their own errors, and ultimately supported their move from a point-based analytical approach for making sense of the data to an aggregate one. From our observations of this designed learning experience, we conclude that physical activity data recording devices can afford students the opportunity to reason with personally relevant data in meaningful ways.


Keywords: Physical activity data · Sensors · Probeware · Visualizations · Statistics · Averages · Mobile technology



Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  1. Department of Instructional Technology and Learning Sciences, Utah State University, Logan, USA
