Getting Started: What, Where, Why

  • Saiyidi Mat Roni
  • Margaret Kristin Merga
  • Julia Elizabeth Morris

Abstract

It is important to consider the many shades of grey when planning research in the educational setting. In education, many factors affect research outcomes: the classroom setting, the teacher, the students, past educational experiences, family experiences and values, and school culture (to name a few). All of these variables need to be considered when planning and conducting research. So, where do you start when planning to conduct quantitative research? This chapter provides a basic introduction to planning educational research using quantitative methods: what quantitative methods are, where it is appropriate to use them, and why they may be useful in achieving your research aims.

Keywords

Quantitative research design · Mixed method research design · Sampling methods · Hypotheses

Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Saiyidi Mat Roni (1)
  • Margaret Kristin Merga (2)
  • Julia Elizabeth Morris (3)
  1. School of Business and Law, Edith Cowan University, Joondalup, Australia
  2. School of Education, Edith Cowan University, Perth, Australia
  3. School of Education, Edith Cowan University, Mount Lawley, Australia