Student-Centered Learning Environments in Higher Education Classrooms, pp. 105–167

Empirical Education Research on the Effectiveness and Quality of Learning and Instruction

  • Sabine Hoidn
  • First Online: 30 October 2016

This chapter focuses on common deeper-level quality dimensions and features of instruction, covering both the quality of learning and teaching processes and the quality of classroom interaction and climate. It reviews process-outcome research, research on effective self-regulated learning, and research on the effectiveness of problem-based learning, conducted mainly in higher education. In addition, it discusses state-of-the-art empirical instructional research on quality features of teaching and learning, conducted mainly in school settings and in teacher education. The chapter concludes by outlining a conceptual framework that serves as a starting point and point of reference for the subsequent empirical study, comprising common design principles and instructional quality dimensions and features to be considered when designing powerful student-centered learning environments.

  • Teacher Education
  • Pedagogical Content Knowledge
  • Mastery Goal
  • Instructional Quality
  • Teacher Enthusiasm

Author information

Authors and Affiliations

St. Gallen, Switzerland

Sabine Hoidn

Copyright information

© 2017 The Author(s)

About this chapter

Cite this chapter.

Hoidn, S. (2017). Empirical Education Research on the Effectiveness and Quality of Learning and Instruction. In: Student-Centered Learning Environments in Higher Education Classrooms. Palgrave Macmillan, New York. https://doi.org/10.1057/978-1-349-94941-0_3

DOI: https://doi.org/10.1057/978-1-349-94941-0_3

Published: 30 October 2016

Publisher Name: Palgrave Macmillan, New York

Print ISBN: 978-1-349-94940-3

Online ISBN: 978-1-349-94941-0

eBook Packages: Education, Education (R0)

Penn State University Libraries

Empirical research in the social sciences and education


Introduction: What is Empirical Research?

Empirical research is based on observed and measured phenomena and derives knowledge from actual experience rather than from theory or belief. 

How do you know if a study is empirical? Read the subheadings within the article, book, or report and look for a description of the research "methodology."  Ask yourself: Could I recreate this study and test these results?

Key characteristics to look for:

  • Specific research questions to be answered
  • Definition of the population, behavior, or phenomena being studied
  • Description of the process used to study this population or phenomena, including selection criteria, controls, and testing instruments (such as surveys)

Another hint: some scholarly journals use a specific layout, called the "IMRaD" format, to communicate empirical research findings. Such articles typically have 4 components:

  • Introduction: sometimes called "literature review" -- what is currently known about the topic -- usually includes a theoretical framework and/or discussion of previous studies
  • Methodology: sometimes called "research design" -- how to recreate the study -- usually describes the population, research process, and analytical tools used in the present study
  • Results: sometimes called "findings" -- what was learned through the study -- usually appears as statistical data or as substantial quotations from research participants
  • Discussion: sometimes called "conclusion" or "implications" -- why the study is important -- usually describes how the research results influence professional practices or future studies
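
As a rough illustration of how these IMRaD cues might be applied in practice, the short Python sketch below scans a list of an article's section headings for the four components described above and flags the article as likely empirical when all four appear. The cue words and the example headings are illustrative assumptions, not part of the guide, and a heuristic like this is no substitute for actually reading the methodology section.

```python
# Heuristic sketch: flag an article as likely empirical if its section headings
# cover all four IMRaD components. Cue words below are illustrative, not exhaustive.
IMRAD_CUES = {
    "Introduction": ["introduction", "literature review", "background"],
    "Methodology": ["method", "methodology", "research design"],
    "Results": ["results", "findings"],
    "Discussion": ["discussion", "conclusion", "implications"],
}

def imrad_components(headings):
    """Return the set of IMRaD components whose cue words appear in the headings."""
    found = set()
    for heading in headings:
        text = heading.lower()
        for component, cues in IMRAD_CUES.items():
            if any(cue in text for cue in cues):
                found.add(component)
    return found

def looks_empirical(headings):
    """True if all four IMRaD components are represented among the headings."""
    return len(imrad_components(headings)) == len(IMRAD_CUES)

# Hypothetical example: headings pulled from an article's table of contents.
example_headings = ["Introduction", "Research Design", "Findings", "Discussion and Implications"]
print(sorted(imrad_components(example_headings)))
# ['Discussion', 'Introduction', 'Methodology', 'Results']
print(looks_empirical(example_headings))  # True
```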

Reading and Evaluating Scholarly Materials

Reading research can be a challenge. However, the tutorials and videos below can help. They explain what scholarly articles look like, how to read them, and how to evaluate them:

  • CRAAP Checklist A frequently-used checklist that helps you examine the currency, relevance, authority, accuracy, and purpose of an information source.
  • IF I APPLY A newer model of evaluating sources which encourages you to think about your own biases as a reader, as well as concerns about the item you are reading.
  • Credo Video: How to Read Scholarly Materials (4 min.)
  • Credo Tutorial: How to Read Scholarly Materials
  • Credo Tutorial: Evaluating Information
  • Credo Video: Evaluating Statistics (4 min.)
  • URL: https://guides.libraries.psu.edu/emp

Open access

  • Published: 22 December 2022

A systematic review of high impact empirical studies in STEM education

  • Yeping Li
  • Yu Xiao
  • Ke Wang
  • Nan Zhang
  • Yali Pang
  • Ruilin Wang
  • Chunxia Qi
  • Zhiqiang Yuan
  • Jianxing Xu
  • Sandra B. Nite
  • Jon R. Star

International Journal of STEM Education, volume 9, Article number: 72 (2022)


The formation of an academic field is evidenced by many factors, including the growth of relevant research articles and the increasing impact of highly cited publications. Building upon recent scoping reviews of journal publications in STEM education, this study aimed to provide a systematic review of high impact empirical studies in STEM education to gain insights into the development of STEM education research paradigms. Through a search of the Web of Science core database, we identified the top 100 most-cited empirical studies focusing on STEM education that were published in journals from 2000 to 2021 and examined them in terms of various aspects, including the journals where they were published, disciplinary content coverage, research topics and methods, and the authors' nationality/region and profession. The results show that STEM education continues to gain more exposure and varied disciplinary content, with an increasing number of high impact empirical studies published in journals across various STEM disciplines. High impact research articles were mainly authored by researchers in the West, especially the United States, and indicate possible “hot” topics within the broader field of STEM education. Our analysis also revealed increased participation and contributions from researchers in diverse fields who are working to formulate research agendas in STEM education and the nature of STEM education scholarship.

Introduction

Two recent reviews of research publications, the first examining articles in the International Journal of STEM Education (IJSTEM) and the second looking at an expanded scope of 36 journals, examined how scholarship in science, technology, engineering, and mathematics (STEM) education has developed over the years (Li et al., 2019, 2020a). Although these two reviews differed in multiple ways (e.g., the number of journals covered, the time period of article publications, and article selection), they shared the common purpose of providing an overview of the status and trends in STEM education research. The selection of journal publications in these two reviews thus emphasized coverage and the inclusion of all relevant publications but did not consider publication impact. Given that the development of a vibrant field depends not only on the number of research outputs and their growth over the years but also on the existence and influence of high impact research articles, we aimed in the present review to identify and examine such high impact publications in STEM education.

Learning from existing reviews of STEM education research

Existing reviews of STEM education have provided valuable insights about STEM education scholarship development over the years. In addition to the two reviews mentioned above, there are many other research reviews on different aspects of STEM education. For example, Chomphuphra et al. (2019) reviewed 56 journal articles published from 2007 to 2017 covering three popular topics: innovation for STEM learning, professional development, and gender gap and career in STEM. They identified and selected these journal articles through searching the Scopus database and two additional journals in STEM education that were not indexed in Scopus at that time. Several other reviews have been conducted and published with a focus on specific topics, such as the assessment of the learning assistant model (Barrasso & Spilios, 2021), STEM education in early childhood (Wan et al., 2021), and research on individuals' STEM identity (Simpson & Bouhafa, 2020). All of these reviews helped in summarizing and synthesizing what we can learn from research on different topics related to STEM education.

Given the ongoing rapid expansion of interest in STEM education, the number of research reviews in STEM education has also been growing rapidly over the years. For example, only one or two research reviews were published yearly in IJSTEM just a few years ago (Li, 2019). However, the situation started to change quickly over the past several years (Li & Xiao, 2022). Table 1 provides a summary list of research reviews published in IJSTEM in 2020 and 2021. The journal published a total of five research reviews in 2020 (8% of 59 publications), which then increased to seven in 2021 (12% of 59 publications).

Taking a closer look at these research reviews, we noticed that three reviews were conducted with a broad perspective to examine research and trends in STEM education (Li et al., 2020a, 2020b) or STEAM (science, technology, engineering, arts, and mathematics) education (Marín-Marín et al., 2021). Relatively large numbers of publications/projects were reviewed in these studies to provide a general overview of research development and trends. The other nine reviews focused on research on specific topics or aspects in STEM education. These results suggest that, with the availability of a rapidly accumulating number of studies in STEM education, researchers have started to go beyond general research trends to examine and summarize research development on specific topics. Moreover, across these 12 reviews, researchers used many different approaches to search multiple data sources (often with specified search terms) to identify and select articles, including journal publications, research reports, conference papers, or dissertations. It appears that researchers have been creative in developing and using specific approaches to select and review publications that are pertinent to their topics. At the same time, however, none of these reviews were designed and conducted to identify and review high impact research articles that had notable influences on the development of STEM education scholarship.

The importance of examining high impact empirical research publications in STEM education

STEM education differs from many other fields, as STEM itself is not a discipline. There are diverse perspectives about the disciplinarity of STEM and STEM education (e.g., Erduran, 2020; Li et al., 2020a; Takeuchi et al., 2020; Tytler, 2020). The complexity and ambiguity in viewing and examining STEM and STEM education present challenges as well as opportunities for researchers to explore and specify what they do and how they do it in ways that differ from, and/or connect with, traditional education in the individual disciplines of science, technology, engineering, and mathematics.

Although the field of STEM education is still in an early stage of its development, STEM education has experienced tremendous growth over the past decade. This field has evolved from traditional individual discipline-based education in STEM fields to multi- and interdisciplinary education in STEM. The development of STEM education has been supported by multiple factors, including research funding (Li et al., 2020b) and the growth of research publications (Li et al., 2020a). High impact publications play a very large role in the growth of the field, as they are read and cited frequently by others and serve to shape the development of scholarship in the field more than other publications.

Among high impact research publications, we can identify several different types of articles, including empirical studies, research reviews, and conceptual or theoretical papers. Research reviews and conceptual/theoretical papers are very valuable, as they synthesize existing research on a specific topic and/or provide new perspective(s) and direction(s), but they are typically not empirical studies. Review articles aim to provide a summary of the current state of the research in the field or on a particular topic, and they help readers to gain an overview about a topic, key issues and publications. Thus, they are more about what has been published in the literature about a topic and less about reporting new empirical evidence about a topic. Similarly, theoretical or conceptual papers tend to draw on existing research to advance theory or propose new perspectives. In contrast, empirical studies require the use and analysis of empirical data to provide empirical evidence. While reporting original research has been typical in empirical studies in education, these studies can also be secondary analyses of empirical data that test hypotheses not considered or addressed in previous studies. Empirical studies are generally published in academic, peer-reviewed journals and consist of distinct sections that reflect the stages in the research process. With the aim to gain insights about research development in STEM education, we thus decided to focus here on empirical studies in STEM education. Examining and reviewing high impact empirical research publications can help provide us a better understanding about emerging trends in STEM education in terms of research topics, methods, and possible directions in the future.

Considerations in identifying and selecting high impact empirical research publications

Publishing as a way of disseminating and sharing knowledge has many types of outlets, including journals, books, and conference proceedings. Different publishing outlets have different advantages in reaching readers. Researchers may search different data sources to identify and select publications to review, as indicated in Table 1. At the same time, journal publications are commonly chosen and viewed as one of the most important outlets valued by the research community for knowledge dissemination and exchange. Specifically, there are two important advantages in terms of evaluating the quality and impact of journal publications over other formats. First, journal publications typically go through a rigorous peer-review process to ensure the quality of manuscripts for publication acceptance based on certain criteria. In educational research, some common criteria being used include “Standards for Reporting on Empirical Social Science Research in AERA Publications” (AERA, 2006), “Standards for Reporting on Humanities-Oriented Research in AERA Publications” (AERA, 2009), and “Scientific Research in Education” (NRC, 2002). Although the peer-review process is also employed in assessing and selecting proposals or papers for publication acceptance in other formats such as books and conference proceedings, the peer-review process employed by journals (especially reputable and top journals in a field) tends to be more rigorous and selective than that of other publication formats. Second, the impact of journals and their publications has frequently been evaluated by peers and different indexing services for inclusion, such as Clarivate’s Social Sciences Citation Index (SSCI) and Elsevier’s Scopus. The citation information collected and evaluated by indexing services provides another important measure of the quality and impact of selected journals and their publications. Based on these considerations, we decided to select and review those journal publications that can be identified as having high citations to gain an overview of their impact on the research development of STEM education.

The approach of focusing on the selection and review of journal publications with high citations has also been used by many other scholars. For example, Martín-Páez et al. (2019) conducted a literature review to examine how STEM education is conceptualized, used, and implemented in educational studies. To ensure the quality of published articles for review, they searched and selected journal articles published in the 2013–2018 period from the Web of Science (WoS) database only. Likewise, Akçayır and Akçayır (2017) conducted a systematic literature review on augmented reality used in educational settings. They used keywords to search all SSCI-indexed journals from the WoS database to identify and select published articles, given that WoS provides easy access to search SSCI-indexed articles. In addition to the method of searching the WoS database, some researchers used other approaches to identify and select published articles with high citations. For example, some researchers may search different databases to identify and select articles for reviews, such as Scopus (Chomphuphra et al., 2019) and Google (Godin et al., 2015). In comparison, however, the WoS core database is more selective than many others, including Scopus. The WoS is the world’s leading scientific citation search and analytical information platform (Li et al., 2018), and has its own independent and thorough editorial process to ensure journal quality together with the most comprehensive and complete citation network (https://clarivate.com/webofsciencegroup/solutions/webofscience-ssci/). Its core database has been commonly used as a reliable indexing database with close attention to high standard research publications with a peer-review process and is thus used in many research review studies (e.g., Akçayır & Akçayır, 2017; Li et al., 2018; Marín-Marín et al., 2021; Martín-Páez et al., 2019).

It should be noted that some researchers have used a different approach to identify and select high impact publications, rather than focusing on article citations. This alternative approach is to identify leading journals in specific fields first and then select relevant articles from those journals. For example, Brown (2012) identified and selected eight important journals in each STEM discipline after consulting with university faculty and K-12 teachers. Once these journals were selected, Brown then located 60 articles that authors self-identified as connected to STEM education from over 1100 articles published between January 1, 2007 and October 1, 2010. However, as there was no well-established journal in STEM education until just a few years ago (Li et al., 2020a), the approach used by Brown may be less useful for identifying high impact publications in the field. In fact, researchers in STEM education have been publishing their high-quality articles in many different journals, especially well-established journals with an impact factor. Thus, this approach would not ensure the selection of high impact articles in STEM education, even if the articles were selected from well-recognized journals rooted in the individual STEM disciplines.

In summary, we searched the WoS core database to identify and select high impact empirical research articles in STEM education, that is, highly cited articles published in journals indexed in the WoS.

Current review

Similar to previous research reviews (e.g., Li et al., 2020a), we needed to specify the scope of the current review by considering the following two issues:

What time period should be considered?

How should we identify and select highly cited research publications in STEM education?

Time period

As discussed in a previous review (Li et al., 2020a), the acronym STEM did not exist until the early 2000s, and its existence has helped to focus attention and effort on STEM education. Thus, consistent with the time period used in the previous review of the status and trends in STEM education, we decided to select articles starting from the year 2000. At the same time, we could use the acronym STEM as an identifier for locating journal articles, in the same way as done before (Li et al., 2020a) and by others (e.g., Brown, 2012; Mizell & Brown, 2016). We set the end of 2021 as the cutoff for publication search and inclusion.

Searching and identifying highly cited empirical research journal publications in STEM education

To identify and select journal articles in STEM education from the WoS core database, we used the common approach of keyword searches, as in many other reviews (e.g., Gladstone & Cimpian, 2021; Winterer et al., 2020). Li et al. (2020a) also noted the complexity and ambiguity of identifying publications in STEM education. Thus, we planned to identify and select publications in STEM education as those self-identified by their authors. As mentioned above, we therefore used the acronym STEM (or STEAM) as the key terms in our search for publications in STEM education.

Different from the previous review of research status and trends in STEM education (Li et al., 2020a), the current review aimed to identify and select high impact journal articles rather than to provide comprehensive coverage. Thus, we defined and limited the scope of high impact empirical research journal publications to the top 100 most-cited empirical research journal publications obtained from the WoS core database.

Research questions

Li et al. (2020a) showed that STEM education articles have been published in many different journals, especially given the limited journal choices available in STEM education. With a broader range of journals and a longer period of time covered in this review, we can gain insights by examining multiple aspects of the top 100 most-cited empirical studies, including the journals in which these studies were published, publication years, disciplinary content coverage, and research topics and methods. In addition, recent reviews suggested the value of examining possible trends in authorship and school level focus (Li, 2022; Li & Xiao, 2022). Taken together, we are interested in addressing the following six research questions:

What are the top 100 most-cited empirical STEM education research journal publications?

What are the distributions and patterns of the top 100 most-cited empirical research publications in different journals?

What is the disciplinary content coverage of the top 100 most-cited empirical research journal publications, and what are the possible trends?

What are the research topics and methods of the top 100 most-cited empirical research journal publications?

What are the corresponding authors’ nationalities/regions and professions?

What are the school level foci of the top 100 most-cited empirical research journal publications over the years?

Based on the above discussion, we carried out the following steps for this systematic review to address these research questions.

Searching and identifying the top 100 most-cited empirical research journal publications in STEM education

Figure 1 provides a summary of the article search and selection process used for this review. The process started with a search of the WoS core database on September 12, 2022 under the field of "topic" (covering title, abstract, author keywords, and keywords plus), using the search terms "STEM" OR "STEAM" OR "science, technology, engineering, and mathematics". Because there are many different categories in the WoS database, we then restricted the search to the four WoS categories listed under "education": "Education Educational Research," "Education Scientific Disciplines," "Psychology Educational," and "Education Special." The time period of the publication search was further specified as 2000 to 2021.

Figure 1. Flowchart of the publication search, identification, and selection process
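
For illustration, the search just described can be written as a single advanced-search string. The sketch below is not from the study; the WoS field tags TS (topic), WC (Web of Science category), and PY (publication year) are assumptions about the advanced-search syntax, and the snippet only assembles the query text.

```python
# Minimal sketch (not the authors' code): assemble the reported search as a
# WoS-style advanced-search string. The field tags TS/WC/PY are assumptions.
topic_terms = ['"STEM"', '"STEAM"', '"science, technology, engineering, and mathematics"']
categories = [
    "Education Educational Research",
    "Education Scientific Disciplines",
    "Psychology Educational",
    "Education Special",
]

query = (
    "TS=(" + " OR ".join(topic_terms) + ")"
    " AND WC=(" + " OR ".join(f'"{c}"' for c in categories) + ")"
    " AND PY=(2000-2021)"
)
print(query)
```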

The search returned 9275 publications under "Education Educational Research," 2161 under "Education Scientific Disciplines," 247 under "Psychology Educational," and 15 under "Education Special." The combined list of all publications was then placed in descending order of citation counts as of the search date of September 12, 2022, and each publication record was screened one by one by three researchers using the inclusion and exclusion criteria (see Table 2). When a publication record was not detailed enough, we obtained the full article to check and determine its eligibility. The process ended after the top 100 most-cited empirical research journal publications had been identified and selected.
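
The ranking-and-screening step can be sketched as follows. This is only an illustration under assumed record fields ("uid", "citations", and the flags checked in is_eligible() are hypothetical); the actual inclusion and exclusion criteria are those summarized in Table 2.

```python
# Illustrative sketch of the selection step: de-duplicate, rank by citations,
# then screen records one by one until 100 eligible publications are found.
from typing import Dict, Iterable, List

def is_eligible(record: Dict) -> bool:
    # Placeholder for the Table 2 criteria, e.g., keep only empirical
    # research articles in STEM education. Field names are hypothetical.
    return record.get("is_empirical", False) and record.get("doc_type") == "Article"

def select_top_cited(records: Iterable[Dict], target: int = 100) -> List[Dict]:
    # De-duplicate records returned under more than one WoS category
    # ("uid" stands for a hypothetical unique record identifier).
    unique = {r["uid"]: r for r in records}
    # Rank by citation count in descending order, as of the search date.
    ranked = sorted(unique.values(), key=lambda r: r["citations"], reverse=True)
    selected: List[Dict] = []
    for record in ranked:  # screen one by one, most-cited first
        if is_eligible(record):
            selected.append(record)
        if len(selected) == target:
            break
    return selected
```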

Data analysis

To address research question 3, we categorized all 100 publications in terms of the number of STEM disciplines covered in a study. Two general categories were used for this review: publications within a single discipline of STEM vs. those addressing multi- or interdisciplinary STEM. In contrast to the detailed classifications used in a previous review (Li et al., 2020a), this simplified classification helps reveal overall trends in the disciplinary content coverage and approach reflected in high impact empirical research in STEM education.

To examine research topics, we used the same list of topics from previous reviews (Li & Xiao, 2022 ; Li et al., 2020a ). The following list contains the seven topic categories (TCs) that were used to classify and examine all 100 publications identified and selected from the search in this study.

TC1: Teaching, teacher, and teacher education in STEM (including both pre-service and in-service teacher education) in K-12 education;

TC2: Teacher and teaching in STEM (including faculty development, etc.) at post-secondary level;

TC3: STEM learner, learning, and learning environment in K-12 education;

TC4: STEM learner, learning, and learning environments (excluding pre-service teacher education) at post-secondary level;

TC5: Policy, curriculum, evaluation, and assessment in STEM (including literature reviews about a field in general);

TC6: Culture, social, and gender issues in STEM education;

TC7: History, epistemology, and perspectives about STEM and STEM education.

To examine research methods, we coded all publications in terms of the following methodological categories: (1) qualitative methods, (2) quantitative methods, and (3) mixed methods. We assigned each publication to only one research topic and one method, following the process used in previous reviews (Li et al., 2019, 2020a). When more than one topic or method could apply to a publication, the primary topic and/or method was chosen and assigned after discussion.
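
As a sketch of the single-assignment rule described above, the following hypothetical snippet assigns each publication exactly one topic category and one method code; resolve_by_discussion() merely stands in for the coders' consensus step and is not part of the original study.

```python
# Hypothetical coding sketch: one primary topic category (TC1-TC7) and one
# method code per publication, with ties resolved by discussion.
TOPIC_CODES = {"TC1", "TC2", "TC3", "TC4", "TC5", "TC6", "TC7"}
METHOD_CODES = {"qualitative", "quantitative", "mixed"}

def resolve_by_discussion(candidates):
    # Placeholder for the coders' joint decision on the primary code.
    return candidates[0]

def assign_codes(candidate_topics, candidate_methods):
    """Return exactly one topic category and one method code."""
    topic = candidate_topics[0] if len(candidate_topics) == 1 else resolve_by_discussion(candidate_topics)
    method = candidate_methods[0] if len(candidate_methods) == 1 else resolve_by_discussion(candidate_methods)
    assert topic in TOPIC_CODES and method in METHOD_CODES
    return topic, method

# Example: a study coded as both TC4 and TC6 with quantitative methods is
# resolved to a single primary topic after discussion.
print(assign_codes(["TC4", "TC6"], ["quantitative"]))
```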

To address research question 5, we examined the corresponding author's (or, if no corresponding author was indicated, the first author's) nationality/region and profession. Many publications in STEM education have joint authorship but may contain limited information about the different co-authors, so focusing on the corresponding author's nationality/region is a feasible approach, as we learned from a previous research review (Li et al., 2020a). For the corresponding author's profession, we used the same two general categories as recent reviews (Li, 2022; Li & Xiao, 2022), "education" and "STEM+", which differentiate a corresponding author's profession in education/educational research from professions in disciplines and fields other than education. If a publication's corresponding author was listed as affiliated with multiple departments/institutions, the first departmental/institutional affiliation was used to identify the author's nationality/region and profession.
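
A minimal sketch of this coding rule is shown below, under assumed record fields (the "is_corresponding", "affiliations", and "is_education_unit" keys are hypothetical): fall back to the first author when no corresponding author is flagged, and use only the first listed affiliation for nationality/region and profession.

```python
def code_corresponding_author(authors):
    """Code nationality/region and profession ("education" vs. "STEM+")."""
    # Fall back to the first author when no corresponding author is flagged.
    corresponding = next((a for a in authors if a.get("is_corresponding")), authors[0])
    # Use only the first listed affiliation.
    affiliation = corresponding["affiliations"][0]
    profession = "education" if affiliation.get("is_education_unit") else "STEM+"
    return {"country_region": affiliation["country_region"], "profession": profession}

# Hypothetical example record:
authors = [
    {"name": "A. Author", "is_corresponding": False,
     "affiliations": [{"country_region": "USA", "is_education_unit": False}]},
    {"name": "B. Author", "is_corresponding": True,
     "affiliations": [{"country_region": "UK", "is_education_unit": True}]},
]
print(code_corresponding_author(authors))  # {'country_region': 'UK', 'profession': 'education'}
```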

To answer research question 6, we adopted the three categories used in recent research reviews: K-12, postsecondary, and general (Li, 2022; Li & Xiao, 2022). These school level categories help reveal the distribution of STEM education research interests and development across school levels. While the first two categories are self-explanatory, the "general" category is for empirical research publications on questions or issues that are either pertinent to all school levels or that cross the boundary between K-12 school and college.

Results and discussion

The following sections report the findings corresponding to each of the six research questions.

Top 100 most-cited empirical research articles from 2000 to 2021

Figure 2 shows the distribution of the top 100 most-cited empirical research journal publications in STEM education over the years 2000–2021. As the majority of these publications (72 out of 100, 72%) were published between 2011 and 2016, the results suggest that publications typically need about 5–10 years to accumulate enough citations for inclusion. Research articles published more than 10 years ago tend to become outdated, unless they have been recognized as classics in the field. That some recent publications (6 publications from 2018–2019) have already gained high citations could suggest the emergence of interesting 'hot' topics in the field.

Figure 2. Distribution of the top 100 most-cited empirical research publications over the years. (Note: all 100 of these most-cited publications were published in the years 2005–2019.)

To get a more fine-grained sense of these highly cited research articles, we took a more detailed look at the top ten most-cited publications from the search (see Table 3). These ten publications were published between 2005 and 2014, with an average of 337 citations and a range of 238–820 citations per article. Only two of the top ten articles were published before 2010; both gained very impressive citations over the years (820 citations for the article published in 2009 and 289 citations for the one published in 2005). The ongoing high citations of these two research articles are a clear indication of their impact and importance in the field.

Table 3 also shows that the top ten most-cited empirical research articles were published in six different journals, the majority of which focus on general educational research or educational psychology. The importance of STEM education research was clearly recognized through high impact publications in these well-established journals. At the same time, the results reflect the rapid development of STEM education research in its early stages and the value of examining possible trends over time in the journals that published high impact articles in STEM education.

Moreover, we noticed that all of these top ten articles had corresponding authors from the U.S., with the exception of one by researchers in the U.K. This result is consistent with what we learned from previous reviews of STEM education research publications (Li et al., 2019, 2020a): about 75% of STEM-related journal publications were contributed by U.S. scholars, whether in this journal's publications from 2014 to 2018 (Li et al., 2019) or in publications from 36 journals from 2000 to 2018 (Li et al., 2020a). It is thus not surprising that all of these high impact research publications from 2005 to 2014 were contributed by researchers in the West, especially the United States. (Below we report more about the corresponding authorship of the 100 high impact research publications beyond the top 10 reported here.)

Distributions and patterns of highly cited publications across different journals

Forty-five journals were identified as publishing these top 100 most-cited articles. Table 4 shows that the majority (26) of these journals focus on general educational research or educational psychology and published 52 of the top 100 most-cited articles. Fourteen journals with titles specifying a single STEM discipline published 38 of these top 100 articles, three journals with two STEM disciplines specified in their titles published seven articles, one journal with three STEM disciplines specified published one article, and one journal specifying all four STEM disciplines published two articles. Among these 45 journals, 36 are indexed in SSCI and the remaining nine are indexed in ESCI (Emerging Sources Citation Index). These are clearly all reputable and well-established journals, with 36 established before 2000 and 9 established in or after 2000. Only three journals in the list are Open Access (OA) journals, all established after 2000. The results suggest that researchers have been publishing high impact STEM education research articles in a wide range of well-established traditional journals, the majority in general educational research or educational psychology with a long publishing history. This further confirms that the importance of STEM education research has been well recognized in educational research and educational psychology, as noted above. At the same time, the results imply that the history of STEM education itself has been too brief to establish its own top journals and identity, with only one exception in STEM education (IJSTEM) (Li et al., 2020a).

We classified these 45 journals listed in Table 4 into two general categories: general education research journals (26, all without mention of a STEM discipline in the journal's title) and journals (19) with one or more STEM disciplines specified in their titles. Figure 3 presents the distributions of these top 100 articles across these two general categories over the years. Among the 49 articles published before 2014, the majority (31, 63%) were published in journals on general educational research or educational psychology. However, starting in 2014, a new trend emerged, with more of these highly cited articles (30 out of 51, 59%) published in journals with STEM discipline(s) specified. The result suggests a possible shift toward developing and gaining disciplinary content consciousness in STEM education research publications.

Figure 3. Trend of the top 100 most-cited articles published in journals without vs. with subject discipline(s) of STEM specified. (Note: 0 = journals without STEM discipline specified, 1 = journals with STEM discipline(s) specified.)
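
The before/after-2014 split reported above (and shown in Fig. 3) amounts to a simple two-way tally. The sketch below illustrates it with hypothetical record fields; it is not the analysis code used for the review.

```python
# Illustrative tally of highly cited articles by period and journal category.
from collections import Counter

def tally_journal_categories(publications):
    counts = Counter()
    for pub in publications:
        period = "before 2014" if pub["year"] < 2014 else "2014 onward"
        category = ("STEM-specified journal" if pub["journal_names_stem_discipline"]
                    else "general education journal")
        counts[(period, category)] += 1
    return counts

# With the figures reported above, such a tally would show, e.g.,
# counts[("before 2014", "general education journal")] == 31 and
# counts[("2014 onward", "STEM-specified journal")] == 30.
```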

As a further examination of the distribution of publications in journals with STEM discipline(s) specified, Fig. 4 shows the distributions of these highly cited articles across different journal categories over the years. These highly cited articles were typically published in journals on general educational research or educational psychology before 2014. However, things started to change in 2014, with these highly cited articles appearing in more diverse journals, including those with STEM discipline(s) specified in their titles. Among the journals that specify one or more STEM disciplines, those covering only a single STEM discipline have been the most common outlets. This result is not surprising, as journals devoted to a single STEM discipline are more numerous, often with a long publishing history and support from well-established professional societies for education in that discipline. The trend suggests that the importance of STEM education has also gained increasing recognition from professional societies that used to focus on a single STEM discipline.

Figure 4. Distribution of highly cited research articles across different journal categories over the years. (Note: 0 = journals without STEM discipline specified, 1 = journals with a single discipline of STEM specified, 2 = journals with two disciplines of STEM specified, 3/4 = journals with 3 or 4 disciplines of STEM specified.)

For a glimpse of these recent changes, we took a closer look at the six articles published in 2018 and 2019 as examples (see Table 5). All of these articles have been highly cited in just 3 or 4 years, with an average of 102 citations (range 75–144) per article. The majority of these six articles were published in journals whose titles specify one or more STEM disciplines: three in journals with a single STEM discipline specified and one in a journal on STEM education, with the remaining two in journals on general educational research. At the same time, none of these recent publications focuses on a single discipline of STEM; all address multi- and interdisciplinary STEM education.

Disciplinary content coverage

The search for STEM education publications in the WoS core database relied on keywords that authors used to self-identify their research as STEM education. After coding and categorizing all top 100 publications, 25 research publications were found to focus on a single discipline of STEM and 75 on multi- and interdisciplinary STEM education. That the majority of these 100 most-cited empirical studies focus on multi- and interdisciplinary STEM education reflects the overall emphasis in STEM education, a trend consistent with what was learned from a previous review of journal publications in STEM education (Li et al., 2020a).

Among the 25 research articles on a single discipline of STEM, the majority (56%, 14 out of 25) focused on science, 5 on technology, 4 on mathematics, and 2 on engineering. Of the four STEM disciplines, "science" is arguably the broadest category, so it is not surprising that publications on science are the most prevalent. The result is also consistent with what we can learn from Table 4: among the 14 journals specifying a single STEM discipline that published 38 of the top 100 articles, the seven journals focusing on "science" published 27 of those 38 articles.

To examine possible trends over time, Fig. 5 shows the distribution of these 100 articles across the two disciplinary content coverage categories over the years. For each of the publishing years from 2005 to 2019, there were always more highly cited empirical publications on multi- and interdisciplinary STEM education than on a single discipline of STEM. Moreover, no publications on a single discipline of STEM made the cut for inclusion in the top 100 list before 2011 or after 2017. These results suggest an overall trend of ongoing emphasis on multi- and interdisciplinary research in STEM education, which is further supported by what we learned from the six recent publications in Table 5.

Figure 5. Publication distribution by disciplinary content coverage over the years. (Note: S = single discipline of STEM, M = multiple disciplines of STEM.)

Research topics and methods

Table 6 presents the distribution of all 100 highly cited publications across the seven topic categories (TCs) over the years. Overall, all seven TCs are represented in the top 100 list. The most publications were clearly on TC6 (culture, social, and gender issues in STEM education), followed by TC4 (STEM learner, learning, and learning environments at the post-secondary level). The large number of highly cited publications in these two categories suggests a possible evolution of research interests and topics in the field of STEM education. Taking a closer look at the six recent publications in Table 5, it is clear that culture, social, and gender issues were the focus of these recent publications, with the exception of one publication on assessment. This picture appears somewhat different from what we learned from previous research reviews that did not focus exclusively on high impact publications from the WoS database (Li & Xiao, 2022; Li et al., 2020a).

Looking at the distribution of these publications within each of the seven TCs, "culture, social, and gender issues in STEM education" (TC6) is a topic area with highly cited research publications in almost every publishing year. "STEM learner, learning, and learning environments at post-secondary level" (TC4) also shows consistent, ongoing research interest, with highly cited publications making the list in most publishing years. In contrast, the publication distributions in the remaining TCs did not present clearly notable patterns over the years.

Figure 6 shows the number of publications over the years by the research methods used in these empirical studies. The use of quantitative methods (71) is dominant overall among these most-cited publications from 2005 to 2019, a result consistent with what we learned from a previous research review (Li et al., 2020a). Across the three methodological classifications, qualitative methods were used in 20 empirical studies and mixed methods in only 9. Comparatively, many more of the articles published between 2010 and 2016 used quantitative methods than the other two methods, whereas differences in method use were less dramatic among studies published before 2010 or after 2016. As the use of different methods can help reveal ways of collecting and analyzing data to provide empirical evidence, it will be interesting to follow the development and use of research methods in STEM education as a new empirical research paradigm in the future.

Figure 6. Publication distribution in terms of research methods over the years. (Note: 1 = qualitative, 2 = quantitative, 3 = mixed.)

Corresponding author's nationality/region and profession (see Footnote 1)

Examining the corresponding author's nationality/region helps reveal the international diversity of research engagement and scholarly contribution in STEM education. Figure 7 shows that 87 of the 100 highly cited publications (87%) had a corresponding author from the United States, followed by 6 publications (6%) contributed by researchers in the U.K., with the remaining 7 publications having corresponding authors from seven other countries/regions (i.e., one publication per country/region). The results show some international diversity in terms of the number of countries/regions represented, but with a clear dominance of research contributions from the West, especially the United States. The result echoes what we learned above about the corresponding authors' nationality/region for the top ten most-cited articles (see Table 3).

Figure 7. Distribution of corresponding author's nationality/region of the top 100 articles

Recent reviews of journal publications in IJSTEM suggest a trend of increasing diversity in research contributions from many more countries/regions (Li, 2022; Li & Xiao, 2022). We would not be surprised if the list of the top 100 most-cited empirical research publications contained more contributions from other countries/regions in the future.

After coding the corresponding authors' professions in these top 100 articles, we found similar numbers of publications with corresponding authors who were researchers in education (49) and in STEM+ (51). This result is consistent with the distribution of corresponding authors' professions in recent publications in IJSTEM (Li, 2022). The diversity of contributions to STEM education scholarship from researchers with various disciplinary training is evident.

To examine possible trends in corresponding authors' professions over time, Fig. 8 shows the distributions of these publications across the two profession categories over the years. It is interesting to note that researchers in education typically served as corresponding authors for more of the articles published before 2014: 31 articles by researchers in education and 18 by researchers in STEM+, for a total of 49 published before 2014. However, a new trend has emerged since 2014, with many more researchers in STEM+ serving as corresponding authors of these highly cited research articles: 18 articles by researchers in education and 33 by researchers in STEM+, for a total of 51 published since 2014.

Figure 8. Distribution of publications by corresponding author's profession over the years. (Note: 1 = education, 2 = STEM+.)

This trend is consistent with the increased number of these publications appearing in journals with STEM discipline(s) specified since 2014 (see Figs. 3 and 4). Since 2014, an increasing number of researchers in STEM+ fields have been contributing and publishing empirical research articles in many journals associated with STEM discipline(s), resulting in an increase in citations from professional communities while furthering the development of STEM education scholarship. The result is also consistent with the authorship development of publications in IJSTEM over the years (Li & Xiao, 2022), which shows an increasing trend of STEM education scholarship contributions from diverse STEM+ fields.

Publications by school level over the years

With an increasing trend of contributions from researchers in diverse STEM+ fields, identifying the school level focus can help reveal where these high impact research publications concentrate their attention on issues in STEM education. The coding results show that the majority (63) of these 100 most-cited articles focused on issues at the postsecondary level, 30 articles on issues at the K-12 school level, and 7 articles on issues in the "general" category.

Figure 9 presents the distributions of these highly cited publications across the three school level categories over the years. It is interesting to note that high impact publications on issues at the postsecondary level outnumbered those in the other two categories in almost every publishing year. As educational issues at the K-12 school level have typically been attended to by researchers in education, the increasing number of contributions from researchers in diverse STEM+ fields likely boosted the citations of publications that fit their interests, which lie more at the postsecondary level. The result is consistent with the growing trend of IJSTEM publications on STEM education at the post-secondary level revealed in a recent review (Li & Xiao, 2022).

Figure 9. Distribution of highly cited publications by school level focus and year. (Note: 1 = K-12 school level, 2 = Post-secondary level, 3 = General.)

We also noticed that almost no articles in the "general" category made it onto the list of the top 100 most-cited publications before 2011 or after 2015. This result suggests that high impact empirical research in STEM education has been conducted more at specific school levels than on issues that cross the boundary between K-12 school and college. With an increasing number of publications in the "general" category noted in a recent review of IJSTEM publications (Li & Xiao, 2022), it will be interesting to learn more about the cross-school-boundary development of STEM education scholarship in the future.

Concluding remarks

This systematic review of high impact empirical studies in STEM education examined the top 100 most-cited research articles from the WoS database published in journals from 2000 to 2021. These articles appeared in a wide range of 45 reputable and well-established journals, typically with a long publishing history. The publications place an overall emphasis on multi- and interdisciplinary STEM education rather than on a single discipline of STEM, with an increasing trend of publishing in journals whose titles specify one or more STEM discipline(s). Before 2014, 37% (18 out of 49) of these most-cited articles were published in journals whose titles specify a STEM discipline(s); in contrast, 59% (30 out of 51) of the articles published since 2014 appeared in such journals, and the share rises to 67% among the six articles published in 2018 and 2019. This trend is further elevated by two of those high impact articles recently published in this journal, the International Journal of STEM Education. There appears to be a growing sense of developing disciplinary content consciousness and identity in STEM education.

Consistent with our previous reviews (Li et al., 2019, 2020a), the vast majority of these highly cited STEM research publications were contributed by authors from the West, especially the United States, where STEM and STEAM education originated. Although there were contributions from eight other countries/regions among these top 100 publications, the diversity of international engagement and contribution was limited. Our results also indicate what may count as "hot" topics among these highly cited articles. In particular, the topic of "culture, social, and gender issues in STEM education" is quite prevalent among these highly cited research publications, followed by the topic area of "STEM learner, learning, and learning environments at post-secondary level." In comparison, topics related to disciplinary content integration in STEM teaching and learning and to STEM teacher training have not yet emerged as "hot" among these highly cited empirical studies. Given the increasing trend of diversity noted in a review of recent publications in IJSTEM (Li, 2022), we would not be surprised to see more high impact research publications contributed by researchers from many other countries/regions on diverse topics in the future.

As STEM education does not have a long history, there will be many challenges and opportunities for new development in STEM education. One important dimension is research method. Among the top 100 most-cited empirical studies, quantitative methods were the dominant approach, followed by qualitative methods and then mixed methods. This is not surprising, as research in multidisciplinary STEM education may require collecting and analyzing data across different disciplines, more often in the form of large quantitative data sets than in other data formats. However, as research questions evolve, it will be interesting to follow method development and use in STEM education as a new research paradigm.

We started this review with the intention of gaining insights into the development of STEM education scholarship beyond what we learned about publication growth in STEM education from prior reviews. Indeed, this systematic review provided the opportunity to learn about possible trends and gaps in the different aspects discussed above. At the same time, we can learn even more by making connections across these aspects. One important question in STEM education is to understand the nature of STEM education scholarship and to find ways of developing it. However, STEM is not a discipline in itself, which suggests possible fundamental differences between STEM education scholarship and the scholarship typically defined and classified for a single discipline of STEM. With the increasing participation and contributions of researchers from diverse STEM+ fields, as we learned from this review, there is a good possibility that the nature of STEM education scholarship will be collectively formulated through numerous contributions from diverse scholars. Continued analysis of high impact publications is an important and interesting undertaking that can yield more insights in the years to come.

Availability of data and materials

The data and materials used and analyzed for this report were obtained by searching the Web of Science database, and related journal information is available directly from the journals' websites.

Footnote 1: Our analysis found that in the vast majority (94%) of these top 100 articles, the same researcher served as both first author and corresponding author. Ten articles had more than one corresponding author; in our coding, we used the first corresponding author listed.

Abbreviations

ACM: Association for Computing Machinery
AERA: American Educational Research Association
CBE: Cell Biology Education
ESCI: Emerging Sources Citation Index
IEEE: Institute of Electrical and Electronics Engineers
IJSTEM: International Journal of STEM Education
K-12: Kindergarten-Grade 12
NRC: National Research Council
SSCI: Social Sciences Citation Index
STEM: Science, technology, engineering, and mathematics
STEM+: Disciplines or fields other than education, including those commonly considered under the STEM umbrella plus some others
STEAM: Science, technology, engineering, arts, and mathematics
TC: Topic category
WoS: Web of Science

Akçayır, M., & Akçayır, G. (2017). Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educational Research Review, 20 , 1–11.


American Educational Research Association (AERA). (2006). Standards for reporting on empirical social science research in AERA publications. Educational Researcher, 35 (6), 33–40.

American Educational Research Association (AERA). (2009). Standards for reporting on humanities-oriented research in AERA publications. Educational Researcher, 38 (6), 481–486.

Barrasso, A. P., & Spilios, K. E. (2021). A scoping review of literature assessing the impact of the learning assistant model. International Journal of STEM Education, 8 , 12. https://doi.org/10.1186/s40594-020-00267-8

Brown, J. (2012). The current status of STEM education research. Journal of STEM Education: Innovations & Research, 13 (5), 7–11.


Chomphuphra, P., Chaipidech, P., & Yuenyong, C. (2019). Trends and research issues of STEM education: A review of academic publications from 2007 to 2017. Journal of Physics: Conference Series, 1340 (2019), 012069.

Erduran, S. (2020). Nature of “STEM”? Science & Education, 29 , 781–784. https://doi.org/10.1007/s11191-020-00150-6

Gao, X., Li, P., Shen, J., & Sun, H. (2020). Reviewing assessment of student learning in interdisciplinary STEM education. International Journal of STEM Education, 7 , 24. https://doi.org/10.1186/s40594-020-00225-4

Gladstone, J. R., & Cimpian, A. (2021). Which role models are effective for which students? A systematic review and four recommendations for maximizing the effectiveness of role models in STEM. International Journal of STEM Education, 8 , 59. https://doi.org/10.1186/s40594-021-00315-x

Godin, K., Stapleton, J., Kirkpatrick, S. I., Rhona, M., Hanning, R. M., & Leatherdale, S. T. (2015). Applying systematic review search methods to the grey literature: a case study examining guidelines for school-based breakfast programs in Canada. Systematic Reviews, 4 , 138. https://doi.org/10.1186/s13643-015-0125-0

Jackson, C., Mohr-Schroeder, M. J., Bush, S. B., Maiorca, C., Roberts, T., Yost, C., & Fowler, A. (2021). Equity- Oriented Conceptual Framework for K-12 STEM literacy. International Journal of STEM Education, 8 , 38. https://doi.org/10.1186/s40594-021-00294-z

Li, K., Rollins, J., & Yan, E. (2018). Web of Science use in published research and review papers 1997–2017: A selective, dynamic, cross-domain, content-based analysis. Scientometrics, 115 (1), 1–20.

Li, Y. (2019). Five years of development in pursuing excellence in quality and global impact to become the first journal in STEM education covered in SSCI. International Journal of STEM Education, 6 , 42. https://doi.org/10.1186/s40594-019-0198-8

Li, Y. (2022). Eight years of development in welcoming and engaging diverse scholars to share and promote STEM education research worldwide. International Journal of STEM Education, 9 , 69. https://doi.org/10.1186/s40594-022-00385-5

Li, Y., Froyd, J. E., & Wang, K. (2019). Learning about research and readership development in STEM education: A systematic analysis of the journal’s publications from 2014 to 2018. International Journal of STEM Education, 6 , 19. https://doi.org/10.1186/s40594-019-0176-1

Li, Y., Wang, K., Xiao, Y., & Froyd, J. E. (2020a). Research and trends in STEM education: A systematic review of journal publications. International Journal of STEM Education, 7 , 11. https://doi.org/10.1186/s40594-020-00207-6

Li, Y., Wang, K., Xiao, Y., Froyd, J. E., & Nite, S. B. (2020b). Research and trends in STEM education: A systematic analysis of publicly funded projects. International Journal of STEM Education, 7 , 17. https://doi.org/10.1186/s40594-020-00213-8

Li, Y., & Xiao, Y. (2022). Authorship and topic trends in STEM education research. International Journal of STEM Education, 9 , 62. https://doi.org/10.1186/s40594-022-00378-4

Marín-Marín, J.-A., Moreno-Guerrero, A.-J., Dúo-Terrón, P., & López-Belmonte, J. (2021). STEAM in education: A bibliometric analysis of performance and co-words in Web of Science. International Journal of STEM Education, 8 , 41. https://doi.org/10.1186/s40594-021-00296-x

Martín-Páez, T., Aguilera, D., Perales-Palacios, F. J., & Vílchez-González, J. M. (2019). What are we talking about when we talk about STEM education? A review of literature. Science Education, 103 , 799–822.

Mizell, S., & Brown, S. (2016). The current status of STEM education research 2013–2015. Journal of STEM Education: Innovations & Research, 17 (4), 52–56.

National Research Council (NRC). (2002). Scientific research in education . Committee on Scientific Principles for Education Research. Shavelson, R. J., and Towne, L., Editors. Center for Education. Division of Behavioral and Social Sciences and Education. National Academy Press.

Nguyen, K. A., Borrego, M., Finelli, C. J., DeMonbrun, M., Crockett, C., Tharayil, S., Shekhar, P., Waters, C., & Rosenberg, R. (2021). Instructor strategies to aid implementation of active learning: a systematic literature review. International Journal of STEM Education , 8 , 9. https://doi.org/10.1186/s40594-021-00270-7

Reinholz, D. L., White, I. & Andrews, T. (2021). Change theory in STEM higher education: a systematic review. International Journal of STEM Education , 8 , 37. https://doi.org/10.1186/s40594-021-00291-2

Simpson, A., & Bouhafa, Y. (2020). Youths’ and adults’ identity in STEM: A systematic literature review. Journal for STEM Education Research, 3 , 167–194.

Takeuchi, M. A., Sengupta, P., Shanahan, M.-C., Adams, J. D., & Hachem, M. (2020). Transdisciplinarity in STEM education: A critical review. Studies in Science Education, 56 (2), 213–253.

Tytler, R. (2020). STEM education for the twenty-first century. In J. Anderson & Y. Li (Eds.), Integrated approaches to STEM education: An international perspective (pp. 21–43). Springer.


Wahono, B., Lin, P. L. & Chang, C. Y. (2020). Evidence of STEM enactment effectiveness in Asian student learning outcomes. International Journal of STEM Education , 7 , 36. https://doi.org/10.1186/s40594-020-00236-1

Wan, Z. H., Jiang, Y., & Zhan, Y. (2021). STEM education in early childhood: A review of empirical studies. Early Education and Development, 32 (7), 940–962. https://doi.org/10.1080/10409289.2020.1814986

Winterer, E. R., Froyd, J. E., Borrego, M., Martin, J. P., & Foster, M. (2020). Factors influencing the academic success of Latinx students matriculating at 2-year and transferring to 4-year US institutions—implications for STEM majors: A systematic review of the literature. International Journal of STEM Education, 7 , 34. https://doi.org/10.1186/s40594-020-00215-6

Zhao, F. & Schuchardt, A. (2021). Development of the Sci-math Sensemaking Framework: categorizing sensemaking of mathematical equations in science. International Journal of STEM Education , 8 , 10. https://doi.org/10.1186/s40594-020-00264-x


Acknowledgements

The author would like to thank Marius Jung and the staff at SpringerOpen for their support in publishing this article.

This work was supported by the National Social Science Foundation of China (BHA180134).

Author information

Authors and Affiliations

Texas A&M University, College Station, TX, 77843, USA

Yeping Li, Yu Xiao & Sandra B. Nite

Nicholls State University, Thibodaux, USA

Tianjin Normal University, Tianjin, China

Southwest University, Chongqing, China

Shanghai Normal University, Shanghai, China

Capital Normal University, Beijing, China

Ruilin Wang

Beijing Normal University, Beijing, China

Hunan Normal University, Changsha, China

Zhiqiang Yuan

Yangzhou University, Yangzhou, China

Jianxing Xu

Harvard University, Cambridge, USA

Jon R. Star


Contributions

YL conceived the study, helped with article search and screening, conducted data analyses, and drafted the manuscript. YX and KW contributed with article search, identification, selection and coding. NZ, YP, RW, CQ, ZY, and JX contributed with data coding. SBN and JRS reviewed drafts and contributed to manuscript revisions. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Yeping Li .

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Li, Y., Xiao, Y., Wang, K. et al. A systematic review of high impact empirical studies in STEM education. IJ STEM Ed 9 , 72 (2022). https://doi.org/10.1186/s40594-022-00389-1


Received : 27 November 2022

Accepted : 01 December 2022

Published : 22 December 2022

DOI : https://doi.org/10.1186/s40594-022-00389-1


Keywords

  • Empirical studies
  • Research review
  • STEM education scholarship

empirical studies in education

empirical studies in education

Empirical Research in the Social Sciences and Education

What is empirical research.

  • Finding Empirical Research
  • Designing Empirical Research
  • Ethics & Anti-Racism in Research
  • Citing, Writing, and Presenting Your Work

Academic Services Librarian | Research, Education, & Engagement

Profile Photo

Gratitude to Penn State

Thank you to librarians at Penn State for serving as the inspiration for this library guide

An empirical research article is a primary source where the authors reported on experiments or observations that they conducted. Their research includes their observed and measured data that they derived from an actual experiment rather than theory or belief. 

How do you know if you are reading an empirical article? Ask yourself: "What did the authors actually do?" or "How could this study be re-created?"

Key characteristics to look for:

  • Specific research questions  to be answered
  • Definition of the  population, behavior, or phenomena  being studied
  • Description of the  process or methodology  used to study this population or phenomena, including selection criteria, controls, and testing instruments (example: surveys, questionnaires, etc)
  • You can readily describe what the  authors actually did 

Layout of Empirical Articles

Scholarly journals sometimes use a specific layout for empirical articles, called the "IMRaD" format, to communicate empirical research findings. There are four main components:

  • Introduction : aka "literature review". This section summarizes what is known about the topic at the time of the article's publication. It brings the reader up-to-speed on the research and usually includes a theoretical framework 
  • Methodology : aka "research design". This section describes exactly how the study was done. It describes the population, research process, and analytical tools
  • Results : aka "findings". This section describes what was learned in the study. It usually contains statistical data or substantial quotes from research participants
  • Discussion : aka "conclusion" or "implications". This section explains why the study is important, and also describes the limitations of the study. While research results can influence professional practices and future studies, it's important for the researchers to clarify if specific aspects of the study should limit its use. For example, a study using undergraduate students at a small, western, private college can not be extrapolated to include  all  undergraduates. 
  • Next: Finding Empirical Research >>
  • Last Updated: Nov 8, 2023 4:19 PM
  • URL: https://libguides.stthomas.edu/empiricalresearcheducation

© 2023 University of St. Thomas, Minnesota

IMAGES

  1. Empirical Research: Definition, Methods, Types and Examples

    empirical studies in education

  2. 15 Empirical Evidence Examples (2024)

    empirical studies in education

  3. What is empirical research

    empirical studies in education

  4. Examining Empirical Foundations in Education

    empirical studies in education

  5. (PDF) Educational Technology Professional Development in Higher

    empirical studies in education

  6. Empirical Research: Definition, Methods, Types and Examples

    empirical studies in education

VIDEO

  1. lesson 3 5 empirical and molecular formulas

  2. Empirical Studies Qualitative vs Quantitative

  3. Academic Writing Empirical Studies

  4. Academic Writing Empirical Studies

  5. Research Methods

  6. Why Empirical Facts Are the Lowest Form of Knowledge

COMMENTS

  1. Empirical Education Research on the Effectiveness and Quality ...

    The conceptual framework in Table 3.3 serves as an initial blueprint orienting and informing the case study design, data collection, and analysis process in the context of the empirical case study research conducted in three different higher education classrooms at the HGSE (Yin, 2009).

  2. Empirical Research in the Social Sciences and Education

    Empirical research is based on observed and measured phenomena and derives knowledge from actual experience rather than from theory or belief. How do you know if a study is empirical? Read the subheadings within the article, book, or report and look for a description of the research "methodology."

  3. Empirical Research Methods in Education: A Brief Review

    The current paper critically reviews research methods most commonly used by empirical researchers when analysing education policy issues. The review is confined to evaluations of the internal efficiency of schools and does not look at methods employed to study the external (e.g., labour market) impact of school policies.

  4. A systematic review of high impact empirical studies in STEM ...

    The formation of an academic field is evidenced by many factors, including the growth of relevant research articles and the increasing impact of highly cited publications. Building upon recent scoping reviews of journal publications in STEM education, this study aimed to provide a systematic review of high impact empirical studies in STEM education to gain insights into the development of STEM ...

  5. Studies in Educational Evaluation | Journal | ScienceDirect ...

    About the journal. Studies in Educational Evaluation publishes original reports of evaluation studies. Four types of articles are published by the journal: (a) Empirical evaluation studies representing assessment and evaluation practice in educational systems around the world; (b) Theoretical reflections and …. View full aims & scope.

  6. Empirical Research in the Social Sciences and Education

    While research results can influence professional practices and future studies, it's important for the researchers to clarify if specific aspects of the study should limit its use. For example, a study using undergraduate students at a small, western, private college can not be extrapolated to include all undergraduates.

  7. Full article: Traces of embodied teaching and learning: a ...

    They were assessed for relevance according to the following inclusion criteria: (1) published in a scientific journal, (2) peer-reviewed, (3) addressing the topic of embodied teaching and learning, (4) situated in higher education/teacher education, (5) based on empirical studies and (6) written in English, French or a Scandinavian language.

  8. Full article: Empirical Support for Establishing Common ...

    This paper highlights the frequency of empirical assumptions made in the education literature and proposes a set of harmonized assumptions to address empirical uncertainty that can be used to increase comparability of economic evaluation across programs and across studies.