Classroom Q&A

With Larry Ferlazzo.

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Eight Instructional Strategies for Promoting Critical Thinking



(This is the first post in a three-part series.)

The new question-of-the-week is:

What is critical thinking and how can we integrate it into the classroom?

This three-part series will explore what critical thinking is, whether it can be specifically taught and, if so, how teachers can do so in their classrooms.

Today’s guests are Dara Laws Savage, Patrick Brown, Meg Riordan, Ph.D., and Dr. PJ Caposey. Dara, Patrick, and Meg were also guests on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

You might also be interested in The Best Resources On Teaching & Learning Critical Thinking In The Classroom.

Current Events

Dara Laws Savage is an English teacher at the Early College High School at Delaware State University, where she serves as a teacher, instructional coach, and lead mentor. Dara has been teaching for 25 years (career preparation, English, photography, yearbook, newspaper, and graphic design) and has presented nationally on project-based learning and technology integration:

There is so much going on right now and there is an overload of information for us to process. Did you ever stop to think how our students are processing current events? They see news feeds, hear news reports, and scan photos and posts, but are they truly thinking about what they are hearing and seeing?

I tell my students that my job is not to give them answers but to teach them how to think about what they read and hear. So what is critical thinking and how can we integrate it into the classroom? There are just as many definitions of critical thinking as there are people trying to define it. However, The Critical Thinking Consortium focuses on the tools to create a thinking-based classroom rather than on a definition: “Shape the climate to support thinking, create opportunities for thinking, build capacity to think, provide guidance to inform thinking.” Using these four criteria and pairing them with current events, teachers can easily create learning spaces that thrive on thinking and keep students engaged.

One successful technique I use is the FIRE Write. Students are given a quote, a paragraph, an excerpt, or a photo from the headlines. Students are asked to Focus and respond to the selection for three minutes. Next, students are asked to Identify a phrase or section of the photo and write for two minutes. Third, students are asked to Reframe their response around a specific word, phrase, or section within their previous selection. Finally, students Exchange their thoughts with a classmate. Within the exchange, students also talk about how the selection connects to what we are covering in class.

There was a controversial Pepsi ad in 2017 involving Kendall Jenner and a protest with a police presence. The imagery in the ad was strikingly similar to a photo that went viral of a young woman standing opposite a police line. Using that image from a current event engaged my students and gave them the opportunity to think critically about events of the time.

Here are the two photos and a student response:

F - Focus on both photos and respond for three minutes

In the first picture, you see a strong and courageous black female, bravely standing in front of two officers in protest. She is risking her life to do so. Iesha Evans is simply proving to the world she does NOT mean less because she is black … and yet officers are there to stop her. She did not step down. In the picture below, you see Kendall Jenner handing a police officer a Pepsi. Maybe this wouldn’t be a big deal, except this was Pepsi’s weak, pathetic, and outrageous excuse of a commercial that belittles the whole movement of people fighting for their lives.

I - Identify a word or phrase, underline it, then write about it for two minutes

A white, privileged female in place of a fighting black woman was asking for trouble. A struggle we are continuously fighting every day, and they make a mockery of it. “I know what will work! Here Mr. Police Officer! Drink some Pepsi!” As if. Pepsi made a fool of themselves, and now their already dwindling fan base continues to ever shrink smaller.

R - Reframe your thoughts by choosing a different word, then write about that for one minute

You don’t know privilege until it’s gone. You don’t know privilege while it’s there—but you can and will be made accountable and aware. Don’t use it for evil. You are not stupid. Use it to do something. Kendall could’ve NOT done the commercial. Kendall could’ve released another commercial standing behind a black woman. Anything!

Exchange - Remember to discuss how this connects to our school song project and our previous discussions.

This connects two ways - 1) We want to convey a strong message. Be powerful. Show who we are. And Pepsi definitely tried. … Which leads to the second connection. 2) Not mess up and offend anyone, as had the one alma mater had been linked to black minstrels. We want to be amazing, but we have to be smart and careful and make sure we include everyone who goes to our school and everyone who may go to our school.

As a final step, students read and annotate the full article and compare it to their initial response.

Using current events and critical-thinking strategies like FIRE writing helps create a learning space where thinking is the goal rather than a score on a multiple-choice assessment. Critical-thinking skills can cross over to any of students’ other courses and into life outside the classroom. After all, we as teachers want to help the whole student be successful, and critical thinking is an important part of navigating life after they leave our classrooms.


‘Before-Explore-Explain’

Patrick Brown is the executive director of STEM and CTE for the Fort Zumwalt school district in Missouri and an experienced educator and author:

Planning for critical thinking focuses on teaching the most crucial science concepts, practices, and logical-thinking skills, as well as on the best use of instructional time. One way to ensure that lessons maintain a focus on critical thinking is to attend to the instructional sequence used to teach.

Explore-before-explain teaching is all about promoting critical thinking for learners to better prepare students for the reality of their world. Having an explore-before-explain mindset means that in our planning, we prioritize giving students firsthand experiences with data, allowing students to construct evidence-based claims that focus on conceptual understanding, and challenging students to discuss and think about the why behind phenomena.

Just think of the critical thinking that has to occur for students to construct a scientific claim. 1) They need the opportunity to collect data, analyze it, and determine how to make sense of what the data may mean. 2) With data in hand, students can begin thinking about the validity and reliability of their experience and the information collected. 3) They can consider what differences, if any, they might find if they completed the investigation again. 4) They can scrutinize outlying data points to determine whether they are an artifact of a true difference that merits further exploration or of a misstep in the procedure, measuring device, or measurement. All of these intellectual activities help them form a more robust understanding and are evidence of their critical thinking.

In explore-before-explain teaching, all of these hard critical-thinking tasks come before teacher explanations of content. Whether we use discovery experiences, problem-based learning, or inquiry-based activities, strategies that are geared toward helping students construct understanding promote critical thinking because students learn content by doing the practices valued in the field to generate knowledge.


An Issue of Equity

Meg Riordan, Ph.D., is the chief learning officer at The Possible Project, an out-of-school program that collaborates with youth to build entrepreneurial skills and mindsets and provides pathways to careers and long-term economic prosperity. She has been in the field of education for over 25 years as a middle and high school teacher, school coach, college professor, regional director of N.Y.C. Outward Bound Schools, and director of external research with EL Education:

Although critical thinking often defies straightforward definition, most in the education field agree it consists of several components: reasoning, problem-solving, and decision-making, plus analysis and evaluation of information, such that multiple sides of an issue can be explored. It also includes dispositions and “the willingness to apply critical-thinking principles, rather than fall back on existing unexamined beliefs, or simply believe what you’re told by authority figures.”

Despite variation in definitions, critical thinking is nonetheless promoted as an essential outcome of students’ learning—we want to see students and adults demonstrate it across all fields, professions, and in their personal lives. Yet there is simultaneously a rationing of opportunities in schools for students of color, students from under-resourced communities, and other historically marginalized groups to deeply learn and practice critical thinking.

For example, many of our most underserved students often spend class time filling out worksheets, work that promotes high compliance but little engagement, inquiry, critical thinking, or creation of new ideas. At a time in our world when college and careers are critical for participation in society and the global, knowledge-based economy, far too many students struggle within classrooms and schools that reinforce low expectations and inequity.

If educators aim to prepare all students for an ever-evolving marketplace and develop skills that will be valued no matter what tomorrow’s jobs are, then we must move critical thinking to the forefront of classroom experiences. And educators must design learning to cultivate it.

So, what does that really look like?

Unpack and define critical thinking

To understand critical thinking, educators need to first unpack and define its components. What exactly are we looking for when we speak about reasoning or exploring multiple perspectives on an issue? How does problem-solving show up in English, math, science, art, or other disciplines—and how is it assessed? At Two Rivers, an EL Education school, the faculty identified five constructs of critical thinking, defined each, and created rubrics to generate a shared picture of quality for teachers and students. The rubrics were then adapted across grade levels to indicate students’ learning progressions.

At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students’ early experiences through 12th grade. For instance, a kindergarten student may be expected to “identify cause and effect in familiar contexts,” while an 8th grader should demonstrate the ability to “seek out sufficient evidence before accepting a claim as true,” “identify bias in claims and evidence,” and “reconsider strongly held points of view in light of new evidence.”

When faculty and students embrace a common vision of what critical thinking looks and sounds like and how it is assessed, educators can then explicitly design learning experiences that call for students to employ critical-thinking skills. This kind of work must occur across all schools and programs, especially those serving large numbers of students of color. As Linda Darling-Hammond asserts, “Schools that serve large numbers of students of color are least likely to offer the kind of curriculum needed to ... help students attain the [critical-thinking] skills needed in a knowledge work economy.”

So, what can it look like to create those kinds of learning experiences?

Designing experiences for critical thinking

After defining a shared understanding of “what” critical thinking is and “how” it shows up across multiple disciplines and grade levels, it is essential to create learning experiences that impel students to cultivate, practice, and apply these skills. There are several levers that offer pathways for teachers to promote critical thinking in lessons:

1. Choose Compelling Topics: Keep it relevant

A key Common Core State Standard asks students to “write arguments to support claims in an analysis of substantive topics or texts using valid reasoning and relevant and sufficient evidence.” That might not sound exciting or culturally relevant. But a learning experience designed for a 12th grade humanities class engaged learners in a compelling topic—policing in America—to analyze and evaluate multiple texts (including primary sources) and share the reasoning for their perspectives through discussion and writing. Students grappled with ideas and their beliefs and employed deep critical-thinking skills to develop arguments for their claims. Embedding critical-thinking skills in curriculum that students care about and connect with can ignite powerful learning experiences.

2. Make Local Connections: Keep it real

At The Possible Project, an out-of-school-time program designed to promote entrepreneurial skills and mindsets, students in a recent summer online program (modified from in-person due to COVID-19) explored the impact of COVID-19 on their communities and local BIPOC-owned businesses. They learned interviewing skills through a partnership with Everyday Boston, conducted virtual interviews with entrepreneurs, evaluated information from their interviews and local data, and examined their previously held beliefs. They created blog posts and videos to reflect on their learning and consider how their mindsets had changed as a result of the experience. In this way, we can design powerful community-based learning and invite students into productive struggle with multiple perspectives.

3. Create Authentic Projects: Keep it rigorous

At Big Picture Learning schools, students engage in internship-based learning experiences as a central part of their schooling. Their school-based adviser and internship-based mentor support them in developing real-world projects that promote deeper learning and critical-thinking skills. Such authentic experiences teach “young people to be thinkers, to be curious, to get from curiosity to creation … and it helps students design a learning experience that answers their questions, [providing an] opportunity to communicate it to a larger audience—a major indicator of postsecondary success.” Even in a remote environment, we can design projects that ask more of students than rote memorization and that spark critical thinking.

Our call to action is this: As educators, we need to make opportunities for critical thinking available not only to the affluent or those fortunate enough to be placed in advanced courses. The tools are available; let’s use them. Let’s interrogate our current curriculum and design learning experiences that engage all students in real, relevant, and rigorous experiences that require critical thinking and prepare them for promising postsecondary pathways.


Critical Thinking & Student Engagement

Dr. PJ Caposey is an award-winning educator, keynote speaker, consultant, and author of seven books who currently serves as the superintendent of schools for the award-winning Meridian CUSD 223 in northwest Illinois. You can find PJ on most social-media platforms as MCUSDSupe:

When I start my keynote on student engagement, I invite two people up on stage and give them each five paper balls to shoot at a garbage can also conveniently placed on stage. Contestant One shoots their shot, and the audience gives approval. Four out of five is a heckuva score. Then, just before Contestant Two shoots, I blindfold them and start moving the garbage can back and forth. I usually try to ensure that they can at least make one of their shots, but nobody is successful in this unfair environment.

I thank them and send them back to their seats and then explain that this little activity was an analogy for student engagement. While we all know we want student engagement, we are shooting at different targets. More importantly, it is nearly impossible for teachers to hit a target that is moving and that they cannot see.

Within the world of education, and particularly as educational leaders, we have failed to simplify what student engagement looks like, and it is impossible to define or articulate student engagement if we cannot clearly articulate what critical thinking is and what it looks like in a classroom. Simply put, without critical thought, there is no engagement.

The good news here is that critical thought has been defined and placed into taxonomies for decades already. This is not something new and not something that needs to be redefined. I am a Bloom’s person, but there is nothing wrong with DOK or some of the other taxonomies, either. To be precise, I am a huge fan of Daggett’s Rigor and Relevance Framework. I have used that as a core element of my practice for years, and it has shaped who I am as an instructional leader.

So, in order to explain critical thought, a teacher or a leader must familiarize themselves with these tried-and-true taxonomies. Easy, right? Yes, sort of. The issue is not understanding what critical thought is; it is the ability to integrate it into the classroom. In order to do so, there are four key steps every educator must take.

  • Integrating critical thought/rigor into a lesson does not happen by chance; it happens by design. Planning for critical thought and engagement is much different from planning for a traditional lesson. In order to plan for kids to think critically, you have to provide a base of knowledge and excellent prompts that allow them to explore their own thinking in order to analyze, evaluate, or synthesize information.
  • SIDE NOTE – Bloom’s verbs are a great way to start when writing objectives, but true planning will take you deeper than this.

QUESTIONING

  • If the questions and prompts given in a classroom have correct answers or if the teacher ends up answering their own questions, the lesson will lack critical thought and rigor.
  • Script five questions that force higher-order thought prior to every lesson. Experienced teachers may not feel they need this, but it helps to create an effective habit.
  • If lessons are rigorous and assessments are not, students will do well on their assessments, and that may not be an accurate representation of the knowledge and skills they have mastered. If lessons are easy and assessments are rigorous, the exact opposite will happen. When deciding to increase critical thought, it must happen in all three phases of the game: planning, instruction, and assessment.

TALK TIME / CONTROL

  • To increase rigor, the teacher must DO LESS. This feels counterintuitive but is accurate. Rigorous lessons involving tons of critical thought must allow students to work on their own, collaborate with peers, and connect their ideas. This cannot happen in a room that is silent except for the teacher talking. In order to increase rigor, decrease talk time and become comfortable with less control. Asking questions and giving prompts that lead to no single correct answer also means less control. This is a tough ask for some teachers. Explained differently, if you assign one assignment and get 30 very similar products, you have most likely assigned a low-rigor recipe. If you assign one assignment and get multiple varied products, then the students have had a chance to think deeply, and you have successfully integrated critical thought into your classroom.


Thanks to Dara, Patrick, Meg, and PJ for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.

Just a reminder: you can subscribe and receive updates from this blog via email. (The RSS feed for this blog, and for all EdWeek articles, has been changed by the new redesign—new ones won’t be available until February.) And if you missed any of the highlights from the first nine years of this blog, you can see a categorized list below.

  • This Year’s Most Popular Q&A Posts
  • Race & Racism in Schools
  • School Closures & the Coronavirus Crisis
  • Classroom-Management Advice
  • Best Ways to Begin the School Year
  • Best Ways to End the School Year
  • Student Motivation & Social-Emotional Learning
  • Implementing the Common Core
  • Facing Gender Challenges in Education
  • Teaching Social Studies
  • Cooperative & Collaborative Learning
  • Using Tech in the Classroom
  • Student Voices
  • Parent Engagement in Schools
  • Teaching English-Language Learners
  • Reading Instruction
  • Writing Instruction
  • Education Policy Issues
  • Differentiating Instruction
  • Math Instruction
  • Science Instruction
  • Advice for New Teachers
  • Author Interviews
  • Entering the Teaching Profession
  • The Inclusive Classroom
  • Learning & the Brain
  • Administrator Leadership
  • Teacher Leadership
  • Relationships in Schools
  • Professional Development
  • Instructional Strategies
  • Best of Classroom Q&A
  • Professional Collaboration
  • Classroom Organization
  • Mistakes in Education
  • Project-Based Learning

I am also creating a Twitter list including all contributors to this column.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


The Institute for Learning and Teaching

The Socratic Method: Fostering Critical Thinking

"Do not take what I say as if I were merely playing, for you see the subject of our discussion—and on what subject should even a man of slight intelligence be more serious? —namely, what kind of life should one live . . ." Socrates

By Peter Conor

This teaching tip explores how the Socratic Method can be used to promote critical thinking in classroom discussions. It is based on the article, The Socratic Method: What it is and How to Use it in the Classroom, published in the newsletter, Speaking of Teaching, a publication of the Stanford Center for Teaching and Learning (CTL).

The article summarizes a talk given by Political Science professor Rob Reich, on May 22, 2003, as part of the center’s Award Winning Teachers on Teaching lecture series. Reich, the recipient of the 2001 Walter J. Gores Award for Teaching Excellence, describes four essential components of the Socratic method and urges his audience to “creatively reclaim [the method] as a relevant framework” to be used in the classroom.

What is the Socratic Method?

Developed by the Greek philosopher Socrates, the Socratic Method is a dialogue between teacher and students, instigated by the continual probing questions of the teacher, in a concerted effort to explore the underlying beliefs that shape the students’ views and opinions. Though often misunderstood, most Western pedagogical tradition, from Plato on, is based on this dialectical method of questioning.

An extreme version of this technique is employed by the infamous professor, Dr. Kingsfield, portrayed by John Houseman in the 1973 movie, “The Paper Chase.” In order to get at the heart of ethical dilemmas and the principles of moral character, Dr. Kingsfield terrorizes and humiliates his law students by painfully grilling them on the details and implications of legal cases.

In his lecture, Reich describes a kinder, gentler Socratic Method, pointing out the following:

  • Socratic inquiry is not “teaching” per se. It does not include PowerPoint-driven lectures, detailed lesson plans, or rote memorization. The teacher is neither “the sage on the stage” nor “the guide on the side.” The students are not passive recipients of knowledge.
  • The Socratic Method involves a shared dialogue between teacher and students. The teacher leads by posing thought-provoking questions. Students actively engage by asking questions of their own. The discussion goes back and forth.
  • The Socratic Method, says Reich, “is better used to demonstrate complexity, difficulty, and uncertainty than to elicit facts about the world.” The aim of the questioning is to probe the underlying beliefs upon which each participant’s statements, arguments, and assumptions are built.
  • The classroom environment is characterized by “productive discomfort,” not intimidation. The Socratic professor does not have all the answers and is not merely “testing” the students. The questioning proceeds open-endedly, with no predetermined goal.
  • The focus is not on the participants’ statements but on the value system that underpins their beliefs, actions, and decisions. For this reason, any successful challenge to this system comes with high stakes—one might have to examine and change one’s life, but, as Socrates famously said, “the unexamined life is not worth living.”
  • “The Socratic professor,” Reich states, “is not the opponent in an argument, nor someone who always plays devil’s advocate, saying essentially: ‘If you affirm it, I deny it. If you deny it, I affirm it.’ This happens sometimes, but not as a matter of pedagogical principle.”

Professor Reich also provides ten tips for fostering critical thinking in the classroom. While no longer available on Stanford’s website, the full article can be found on the web archive: The Socratic Method: What it is and How to Use it in the Classroom.



Center for Teaching

The Teaching Exchange: Fostering Critical Thinking

This article was originally published in the Fall 1999 issue of the CFT’s newsletter, Teaching Forum.

The Teaching Exchange is a forum for teachers at Vanderbilt to share their pedagogical strategies, experiments, and discoveries. Every issue will highlight innovations in teaching across the campus. This ‘exchange’ offers strategies from several different instructors for fostering critical thinking among students.

George Becker, Associate Professor of Sociology

There are two general approaches that I find helpful in producing a classroom setting conducive to critical inquiry. These involve 1) the establishment of an environment in which both parties, student and teacher, function as partners in inquiry, and 2) the employment of a set of questioning strategies specifically geared to the acquisition of higher-order thinking and reasoning skills.

Central to making students feel they are partners in a community of learners is the creation of a climate of trust, so that students feel safe in offering their own ideas. I try to foster a sense of “we-feeling” by asking, for example, “How can we explain this development? What does it mean to us?” Using plural pronouns creates a dialogue that has less of an adversarial tone and underscores the idea of students and teachers as partners in inquiry. I have also found that learning student names as quickly as possible is essential for developing trust. At the beginning of each semester, I ask everyone to bring me a small snapshot (photocopied student IDs work well), and I can review the photos prior to each class. Student compliance is, of course, voluntary.

I give students a rationale for the value of an interactive classroom. I assure them that interaction is not designed to embarrass them, but rather to facilitate learning and make the subject matter more interesting. This lets students know they have some control over class proceedings and that their insights and contributions will be validated in our mutual quest for understanding.

One particularly effective strategy, adopted from my colleague Larry Griffin, is to provide students with the option to “pass” on a particular question. Interestingly, I find that while students welcome this option, they rarely invoke it. It does serve as a motivator as well as an opportunity to exercise reasonable decision making. Another key ingredient is the element of humor. Laughter causes the release of certain chemicals in the brain that help build long-term memory. I try to let humor evolve naturally from content-related dialogue and present it in a good-natured fashion.

The practice of questioning is also central to the development of critical thinking. There are two relatively simple strategies, dealing with aspects of the question-and-answer sequence, that I have found work well: 1) careful design of the questions, and 2) providing sufficient response time.

Preparing for a class, I construct several pivotal questions that address the key facts and concepts of the lesson. These are designed to help students apply their knowledge and understanding of the course content at the levels of analysis, synthesis, and evaluation. Questions seeking to engage students in analysis usually contain such words as interpret, discover, compare, and contrast. Questions designed to facilitate synthesis usually contain such words as imagine, formulate, generalize, or hypothesize. Questions directed toward evaluative thought generally contain terms such as judge, assess, revise, and criticize.

To maximize the value of questioning, the issue of response time is critical. From the teacher’s point of view, it is considered “wait” time, while for the student, it is “think” time, the time it takes to formulate a response. I make it a practice to build in two specific blocks of wait/think time for each questioning episode. Once I ask a pivotal question, I try to remain silent for three to five seconds to allow students to formulate their answers. When a student responds, I pause for a second time, again without comment or reaction. This prompts further thought and comment on the part of the student and provides an opportunity for others in the class to continue thinking of additional responses.

Leonard Folgarait, Professor and Chair of Fine Arts

The following is excerpted from a presentation on “Effective Learning Strategies” offered as part of the Junior Faculty Teaching Series, sponsored by the Center for Teaching and the Vanderbilt Alumni Fund. The focus session was co-facilitated by Prof. Folgarait as invited senior faculty.

Even a lecture can be an opportunity to encourage critical thinking among students, as long as the teacher takes the time to be very intentional in planning the content, organization, and presentation in such a way as to promote an interactive experience.

Regarding content, I keep two things in mind: students need objective information, such as historical dates, but they also need a larger, conceptual framework to tie the facts together and produce meaning. This is easier in the humanities, but it is possible in any discipline. Our concern should be that the objective information tells us something about the human condition: Why are science and math, for example, important to us?

I try to organize along a theme for the course as a whole and for every lecture, so that each class is self-contained and cohesive, and so that the lectures relate to each other in terms of overall theme, remembering that these are generalizations and need specific, concrete examples.

I try to involve students and create an interactive environment by asking questions that elicit both simple and complex responses. For example, a question seeking a simple answer would be, “What political system was overthrown by the French Revolution of 1789?” A more complex response would be generated by asking, “How do we, today, experience the results of that revolution?” I try to do this often, so that students are given a voice and feel empowered enough to risk thinking critically during a dynamic lecture experience.




Education in the 21st Century, pp. 29–47

Fostering Students’ Creativity and Critical Thinking in Science Education

  • Stéphan Vincent-Lancrin
  • First Online: 31 January 2022


What does it mean to redesign teaching and learning within existing science curricula (and learning objectives) so that students have more space and appropriate tasks to develop their creative and critical-thinking skills? The chapter begins by describing the development of a portfolio of rubrics on creativity and critical thinking, including a conceptual rubric on science tested in primary and secondary education in 11 countries. Teachers in school networks adopted teaching and learning strategies aligned to these OECD rubrics for the development of creativity and critical thinking. Examples of lesson plans and pedagogies that were developed are given, and some key challenges for teachers and learners are reflected on.

  • Critical thinking
  • Science education
  • Innovation in education
  • Lesson plans

The analyses given and the opinions expressed in this chapter are those of the author and do not necessarily reflect the views of the OECD and of its members.


Many project-based science units/courses initially develop “Driving Questions” to contextualise the unit and give learners opportunities to connect the unit to their own experiences and prior ideas.

Adler, L., Bayer, I., Peek-Brown, D., Lee, J., & Krajcik, J. (2017). What controls my health . https://www.oecd.org/education/What-Controls-My-Health.pdf

Davies, M. (2015). In R. Barnett (Ed.), The Palgrave handbook of critical thinking in higher education . Palgrave Macmillan.


Dennett, D. C. (2013). Intuition pumps and other tools for thinking . England: Penguin.


Ennis, R. (1996). Critical thinking . Upper Saddle River, NJ: Prentice-Hall.

Ennis, R. (2018). Critical thinking across the curriculum: A vision. Topoi, 37 (1), 165–184. https://doi.org/10.1007/s11245-016-9401-4 .


Facione, P.A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction . Research findings and recommendations prepared for the Committee on Pre-College Philosophy of the American Philosophical Association. Retrieved from http://www.eric.ed.gov/ERICWebPortal/detail?accno=ED315423

Feynman, R. (1963). The Feynman lectures on physics . (Volume I: The New Millennium Edition: Mainly Mechanics, Radiation, and Heat.). Basic Books.

Feynman, R. (1955). The value of science. In R. Leighton (Ed.), What do you care what other people think? Further adventures of a curious character (pp. 240–257). Penguin Books.

Fullan, M., Quinn, J., & McEachen, J. (2018). Deep learning: Engage the world, change the world . Corwin Press and Ontario Principals’ Council.

Guilford, J. P. (1950). Creativity. American Psychologist, 5 (9), 444–454. https://doi.org/10.1037/h0063487 .

Hitchcock, D. (2018). Critical thinking. In Zalta, E.N. (ed.), The Stanford encyclopedia of philosophy (Fall 2018 Edition). Retrieved from : https://plato.stanford.edu/archives/fall2018/entries/critical-thinking .

Kelley, T. (2001). The art of innovation: Lessons in creativity from IDEO, America’s leading design firm. Currency.

Lubart, T. (2000). Models of the creative process: Past, present and future. Creativity Research Journal, 13 (3–4), 295–308. https://doi.org/10.1207/S15326934CRJ1334_07 .

Lucas, B., Claxton, G., & Spencer, E. (2013). Progression in student creativity in school: First steps towards new forms of formative assessments. In OECD education working papers, 86 . Paris: OECD. https://doi.org/10.1787/5k4dp59msdwk-en .

Lucas, B., & Spencer, E. (2017). Teaching creative thinking: Developing learners who generate ideas and can think critically . England: Crown House Publishing.

McPeck, J. E. (1981). Critical thinking and education . New York: St. Martin’s.

Mednick, S. A. (1962). The associative basis of the creative process. Psychological Review, 69 (3), 220–232. https://doi.org/10.1037/h0048850 .

Newton, L. D., & Newton, D. P. (2014). Creativity in 21st century education. Prospects, 44 (4), 575–589. https://doi.org/10.1007/s11125-014-9322-1 .

Paddock, W., Erwin, S., Bielik, T., & Krajcik, J. (2019). Evaporative cooling . Retrieved from : https://www.oecd.org/education/Evaporative-Cooling.pdf

Rennie, L. (2020). Communicating certainty and uncertainty in science in out-of-school contexts. In D. Corrigan, C. Buntting, A. Jones, & A. Fitzgerald (Eds.), Values in science education: The shifting sands (pp. 7–30). Cham, Switzerland: Springer.

Runco, M. A. (2009). Critical thinking. In M. A. Runco & S. R. Pritzker (Eds.), Encyclopedia of creativity (pp. 449–452) . Academic.

Schneider, B., Krajcik, J., Lavonen, J., & Salmela-Aro, K. (2020). Learning science: The value of crafting engagement in science environments. Yale University Press.


Sternberg, R. J., & Lubart, T. (1999). The concept of creativity: Prospects and paradigm. In R. J. Sternberg (Ed.), Handbook of creativity (pp. 3–14). England: Cambridge University.

Torrance, E. P. (1966). Torrance tests of creative thinking: Norms. Technical manual research edition; Verbal Tests, Forms A and B, Figural Tests, Forms A and B . Princeton, NJ: Personnel.

Torrance, E. P. (1970). Encouraging creativity in the classroom . United States: W.C. Brown.

Vincent-Lancrin, S., González-Sancho, C., Bouckaert, M., de Luca, F., Fernández-Barrerra, M., Jacotin, G., Urgel, J., & Vidal, Q. (2019). Fostering students’ creativity and critical thinking in education: What it means in school . Paris: OECD. https://doi.org/10.1787/62212c37-en .


Author information

Stéphan Vincent-Lancrin, Directorate for Education and Skills, Organisation for Economic Co-operation and Development, Paris, France


Cite this chapter:

Vincent-Lancrin, S. (2021). Fostering Students’ Creativity and Critical Thinking in Science Education. In: Berry, A., Buntting, C., Corrigan, D., Gunstone, R., Jones, A. (eds) Education in the 21st Century. Springer, Cham. https://doi.org/10.1007/978-3-030-85300-6_3



  • Review Article
  • Open access
  • Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

  • Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169)
  • Wei Wang
  • Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)


  • Science, technology and society

Collaborative problem-solving has been widely embraced in the classroom instruction of critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field of education as well as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This research presents the major findings of a meta-analysis of 36 empirical studies published in worldwide educational periodicals during the 21st century to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving can result in a rise or decrease in critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach to foster students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving can significantly and successfully enhance students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); nevertheless, it falls short in terms of improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have an impact on critical thinking, and they can be viewed as important moderating factors that affect how critical thinking develops. On the basis of these results, recommendations are made for further study and instruction to better support students’ critical thinking in the context of collaborative problem-solving.


Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical-thinking instructional approaches is problem-based learning (Liu et al., 2020). Duch et al. (2001) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses ill-structured problems in real-world situations as the starting point for the learning process (Liang et al., 2017). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004; Liang et al., 2017).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct a systematic review and meta-analysis of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Examining how to implement critical-thinking instruction within collaborative problem-solving is therefore essential for developing and enhancing critical thinking; however, this issue remains underexplored, which leaves many teachers ill-equipped to teach critical thinking effectively (Leng and Lu, 2020; Niu et al., 2013). For example, Huber (2016) presented meta-analysis findings from 71 publications on gains in critical thinking over various time frames in college, with the aim of determining whether critical thinking was truly teachable. These authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. However, that study did not determine the usefulness of collaborative problem-solving in fostering students’ critical thinking, nor did it reveal whether significant variations existed among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. (2020) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. (2008) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. Their research revealed that, through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, according to research by Naber and Wyatt (2014) and Sendag and Odabasi (2009) on undergraduate and high school students, respectively, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching.

The above studies show that there is inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. Therefore, it is essential to conduct a thorough and trustworthy review to detect and decide whether and to what degree collaborative problem-solving can result in a rise or decrease in critical thinking. Meta-analysis is a quantitative approach that is utilized to examine data from various separate studies that are all focused on the same research topic. This approach characterizes the effectiveness of an intervention by pooling the effect sizes of numerous individual studies in an effort to reduce the uncertainty brought on by independent research and produce more conclusive findings (Lipsey and Wilson, 2001).
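To make the pooling idea concrete, the following is a minimal sketch, not the authors’ actual analysis (which, as described below, was run in RevMan 5.4), of how per-study effect sizes and their variances might be combined under a DerSimonian-Laird random-effects model; the function and variable names are illustrative.

```python
import numpy as np

def random_effects_pool(effect_sizes, variances):
    """Pool per-study effect sizes with a DerSimonian-Laird random-effects model.

    Illustrative sketch: returns the pooled effect, its standard error, z value,
    95% confidence interval, the Q heterogeneity statistic, and tau^2.
    """
    es = np.asarray(effect_sizes, dtype=float)
    v = np.asarray(variances, dtype=float)

    # Fixed-effect weights and the Q statistic used for heterogeneity testing
    w = 1.0 / v
    fixed = np.sum(w * es) / np.sum(w)
    q = np.sum(w * (es - fixed) ** 2)
    df = len(es) - 1

    # Between-study variance (DerSimonian-Laird estimator), truncated at zero
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights, pooled effect, and 95% confidence interval
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * es) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, se, pooled / se, ci, q, tau2
```

Fed the 36 per-study effect sizes and their variances, a routine like this would return a pooled estimate, z value, and 95% confidence interval of the kind reported in the abstract, along with the Q statistic that underlies heterogeneity testing.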

This paper used a meta-analytic approach and carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking in order to make a contribution to both research and practice. The following research questions were addressed by this meta-analysis:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the effects reported in the included studies are heterogeneous across experimental designs, how do various moderating variables account for the disparities between the study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility, merging, duplicate removal, and analysis of included studies) of Cooper’s (2010) proposed meta-analysis approach for examining quantitative data from various separate studies that are all focused on the same research topic. The relevant empirical research that appeared in worldwide educational periodicals within the 21st century was subjected to this meta-analysis using RevMan 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
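As an aside, the inter-coder agreement check mentioned above can be illustrated with a short, generic sketch of Cohen’s kappa for two coders; this is not the authors’ exact procedure, and the function name is illustrative.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two coders labeling the same set of items (illustrative)."""
    assert len(labels_a) == len(labels_b) and labels_a, "both coders must rate the same items"
    n = len(labels_a)

    # Observed agreement: proportion of items the two coders labeled identically
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    # Chance agreement, computed from each coder's marginal label frequencies
    count_a, count_b = Counter(labels_a), Counter(labels_b)
    categories = set(labels_a) | set(labels_b)
    expected = sum(count_a[c] * count_b[c] for c in categories) / (n * n)

    return (observed - expected) / (1 - expected)
```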

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1, which reports the number of articles included and eliminated during the selection process based on the stated study eligibility criteria.

Figure 1. Flowchart showing the number of records identified, included, and excluded.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers worked together to carry out this entire process, and a consensus rate of almost 94.7% was reached after discussion and negotiation to clarify any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text had to be obtainable. Articles that did not meet the publication-language requirement or were not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to extract the key information from each article, code its properties, and convert descriptive information into quantitative data. This study therefore designed a data-coding template (see Table 1), in which 16 coding fields were ultimately retained.

The data-coding template consisted of three kinds of information. The descriptive information captured basic details about each paper: publication year, author, serial number, and title.

The variable information for the experimental design covered three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variables (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Given the topic of this study, the intervention strategy, as the independent variable, was coded as collaborative or non-collaborative problem-solving. The dependent variable, critical thinking, was coded as cognitive skill and attitudinal tendency. Seven moderating variables were created by grouping and combining the experimental design variables found within the 36 studies (see Table 1): learning stages were coded as higher education, high school, middle school, and primary school or lower; teaching types were coded as mixed courses, integrated courses, and independent courses; intervention durations were coded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were coded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were coded as teacher-supported, technique-supported, and resource-supported; measuring tools were coded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were coded according to the specific subjects used in the 36 included studies.
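To make the coding scheme concrete, the sketch below expresses a single coded study record in Python; the category labels follow the description above, but the field names and structure are illustrative assumptions rather than the authors' actual template.

```python
from dataclasses import dataclass

# Category labels paraphrased from the coding scheme described above (illustrative only).
LEARNING_STAGES = ["primary school or lower", "middle school", "high school", "higher education"]
TEACHING_TYPES  = ["independent course", "integrated course", "mixed course"]
DURATIONS       = ["0-1 weeks", "1-4 weeks", "4-12 weeks", "more than 12 weeks"]
GROUP_SIZES     = ["2-3 persons", "4-6 persons", "7-10 persons", "more than 10 persons"]
SCAFFOLDS       = ["teacher-supported", "technique-supported", "resource-supported"]
TOOLS           = ["standardized", "self-adapting"]

@dataclass
class CodedStudy:
    # Descriptive information
    year: int
    author: str
    serial_number: int
    title: str
    # Variable information
    intervention: str        # "collaborative" or "non-collaborative" problem-solving
    outcome_dimension: str   # "cognitive skill" or "attitudinal tendency"
    learning_stage: str      # one of LEARNING_STAGES
    teaching_type: str       # one of TEACHING_TYPES
    duration: str            # one of DURATIONS
    group_size: str          # one of GROUP_SIZES
    scaffold: str            # one of SCAFFOLDS
    measuring_tool: str      # one of TOOLS
    subject_area: str
    # Data information (per condition), used later to compute the effect size
    n_treatment: int
    mean_treatment: float
    sd_treatment: float
    n_control: int
    mean_control: float
    sd_control: float
```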

The data information contained three statistics for measuring critical thinking: sample size, mean, and standard deviation. It is important to note that studies with different experimental designs often require different formulas to determine the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
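For reference, the pretest–posttest control-group effect size proposed by Morris (2008) is commonly written as follows; this is a sketch of the standard formulation, and the cited page and Supplementary Table S3 give the exact variant used in this paper.

```latex
% Morris (2008) pretest-posttest-control-group effect size (standard formulation)
d_{ppc2} \;=\; c_P \,
\frac{\bigl(M_{\mathrm{post},T}-M_{\mathrm{pre},T}\bigr)-\bigl(M_{\mathrm{post},C}-M_{\mathrm{pre},C}\bigr)}
     {SD_{\mathrm{pre,pooled}}},
\qquad
c_P \;=\; 1-\frac{3}{4\,(n_T+n_C-2)-1},

SD_{\mathrm{pre,pooled}} \;=\;
\sqrt{\frac{(n_T-1)\,SD_{\mathrm{pre},T}^{2}+(n_C-1)\,SD_{\mathrm{pre},C}^{2}}{n_T+n_C-2}}.
```

Here T and C denote the treatment and control groups, and the pooled pretest standard deviation serves as the standardizer.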

Procedure for extracting and coding data

According to the data-coding template (see Table 1), two researchers extracted the information from the 36 papers and entered it into Excel (see Supplementary Table S1). If an article contained multiple studies on critical thinking, or if a study assessed different dimensions of critical thinking, the results were extracted separately. For instance, Tiwari et al. (2010) examined critical thinking outcomes at four time points, which were treated as separate studies, and Chen (2013) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers' consistency coefficient was roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., publication year, author, serial number, and title), variable information (e.g., independent, dependent, and moderating variables), and data information (e.g., means, standard deviations, and sample sizes). The sample data were then tested for publication bias and heterogeneity using RevMan 5.4, and the test results were used to conduct the meta-analysis.

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the overall body of research on the topic, and it can undermine the reliability and accuracy of the meta-analysis. The sample data therefore need to be checked for publication bias (Stewart et al., 2006). A popular check is the funnel plot: publication bias is unlikely when the effect sizes are distributed evenly on either side of the average effect size and concentrated toward the top of the funnel. The data in this analysis are evenly dispersed within the upper portion of the funnel (see Fig. 2), indicating that publication bias is unlikely in this situation.
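As an illustration, a funnel plot of this kind can be drawn directly from the extracted effect sizes and their standard errors; the sketch below uses placeholder values rather than the study's data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder per-study effect sizes and standard errors (not the actual 79 effect quantities).
effect_sizes = np.array([0.4, 0.7, 0.9, 1.1, 0.6, 0.8])
std_errors   = np.array([0.20, 0.15, 0.25, 0.30, 0.10, 0.18])

# Inverse-variance mean, used only to position the funnel's center line.
weights = 1.0 / std_errors**2
center = np.sum(weights * effect_sizes) / np.sum(weights)

fig, ax = plt.subplots()
ax.scatter(effect_sizes, std_errors)
ax.axvline(center, linestyle="--")

# Pseudo 95% confidence limits forming the funnel.
se_grid = np.linspace(0.001, std_errors.max() * 1.1, 100)
ax.plot(center - 1.96 * se_grid, se_grid, linestyle=":")
ax.plot(center + 1.96 * se_grid, se_grid, linestyle=":")

ax.invert_yaxis()                      # larger (more precise) studies plot near the top
ax.set_xlabel("Effect size (SMD)")
ax.set_ylabel("Standard error")
plt.show()
```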

Fig. 2: Funnel plot used to assess publication bias across the 79 effect quantities from the 36 studies.

Heterogeneity test

The results of a heterogeneity test on the effect sizes are used to select the appropriate effect model for the meta-analysis. It is common practice to gauge the degree of heterogeneity using the I² value: I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for a random effect model; otherwise, a fixed effect model should be applied (Lipsey and Wilson, 2001). The heterogeneity test in this paper (see Table 2) yielded an I² of 86%, indicating significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random effect model.
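For reference, the Q statistic and the I² index used in such heterogeneity tests are conventionally defined as follows (standard definitions, not reproduced from the paper):

```latex
% Cochran's Q and the I^2 heterogeneity index (standard definitions)
Q \;=\; \sum_{i=1}^{k} w_i\,\bigl(ES_i-\overline{ES}\bigr)^{2},
\qquad
\overline{ES} \;=\; \frac{\sum_{i} w_i\,ES_i}{\sum_{i} w_i},
\qquad
I^{2} \;=\; \max\!\left(0,\;\frac{Q-(k-1)}{Q}\right)\times 100\%,
```

where w_i = 1/SE_i² is the inverse-variance weight of effect size i and k is the number of effect sizes.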

The analysis of the overall effect size

This meta-analysis used a random effect model, which accounts for the observed heterogeneity, to examine the 79 effect quantities from the 36 studies. As shown in the forest plot of the overall effect (see Fig. 3) and judged against Cohen's criterion (Cohen, 1992), the cumulative effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that collaborative problem-solving can encourage learners to practice critical thinking.
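A minimal sketch of how such a pooled random-effects estimate can be computed is given below, using the DerSimonian–Laird estimator, a common default for random-effects pooling in meta-analysis software; the input values are placeholders, not the study's 79 effect quantities.

```python
import numpy as np
from scipy import stats

def random_effects_pool(es, se):
    """DerSimonian-Laird random-effects pooling of effect sizes.

    es, se: per-study effect sizes and standard errors.
    Returns the pooled effect, its 95% CI, the z statistic, and the p value.
    """
    es, se = np.asarray(es, float), np.asarray(se, float)
    w = 1.0 / se**2                               # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * es) / np.sum(w)
    q = np.sum(w * (es - fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(es) - 1)) / c)      # between-study variance estimate
    w_star = 1.0 / (se**2 + tau2)                 # random-effects weights
    pooled = np.sum(w_star * es) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    z = pooled / se_pooled
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci, z, p

# Placeholder data for demonstration only.
pooled, ci, z, p = random_effects_pool([0.5, 0.9, 1.2, 0.7], [0.15, 0.20, 0.25, 0.10])
print(round(pooled, 2), [round(x, 2) for x in ci], round(z, 2), round(p, 4))
```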

Fig. 3: Forest plot of the overall effect size across the 36 studies.

In addition, this study examined the two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to its growth. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (chi² = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions, the improvement in students' attitudinal tendency is much more pronounced, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners' cognitive skills are more modest, at an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

A two-tailed test of the 79 effect quantities in the forest plot revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors beyond sampling error. Subgroup analysis was therefore used to explore possible moderating factors across the 36 experimental designs, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area, in order to identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the moderating factors all have beneficial effects on critical thinking. The subject area (chi² = 13.36, P < 0.05), group size (chi² = 8.77, P < 0.05), intervention duration (chi² = 12.18, P < 0.01), learning scaffold (chi² = 9.03, P < 0.01), and teaching type (chi² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tool did not show significant intergroup differences (chi² = 3.15, P = 0.21 > 0.05, and chi² = 0.08, P = 0.78 > 0.05), we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The detailed results are as follows:

Different learning stages all influenced critical thinking positively, but without significant intergroup differences (chi² = 3.15, P = 0.21 > 0.05). High school showed the largest effect size (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage's beneficial influence on cultivating learners' critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (chi² = 7.20, P < 0.05). The effect sizes were ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that mixed courses are the most effective teaching type for cultivating critical thinking through collaborative problem-solving.

Different intervention durations significantly improved critical thinking, with significant intergroup differences (chi² = 12.18, P < 0.01). The effect sizes tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration and the impact on critical thinking are positively correlated, with longer interventions having a greater effect.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (chi² = 9.03, P < 0.01). The teacher-supported learning scaffold displayed a high and significant level of impact (ES = 0.92, P < 0.01), while the resource-supported (ES = 0.69, P < 0.01) and technique-supported (ES = 0.63, P < 0.01) learning scaffolds attained medium-to-high levels of impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

Different group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (chi² = 8.77, P < 0.05). Critical thinking showed a general declining trend with increasing group size: the overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and when the group size exceeded 7 people, the improvement in critical thinking fell to the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size; as group size grows, the overall impact declines.

Different measuring tools influenced critical thinking positively, but without significant intergroup differences (chi² = 0.08, P = 0.78 > 0.05). The self-adapting measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, reaching a significant level (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of the measuring tool on cultivating critical thinking, we are unable to explain why it is crucial in fostering the growth of critical thinking through collaborative problem-solving.

Different subject areas had varying impacts on critical thinking, and the intergroup differences were statistically significant (chi² = 13.36, P < 0.05). Mathematics had the greatest overall impact, reaching a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also reached significant levels. Programming technology was the least effective (ES = 0.39, P < 0.01), with only a medium-low degree of effect compared to education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
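For reference, the intergroup chi-squared statistics reported above correspond to the standard test for subgroup differences, which in one common formulation compares the pooled subgroup estimates as follows (a standard formulation, not reproduced from the paper):

```latex
% Test for subgroup differences (standard formulation)
Q_{\mathrm{between}} \;=\; \sum_{j=1}^{m} w_j\,\bigl(\hat{\theta}_j-\hat{\theta}\bigr)^{2},
\qquad
w_j \;=\; \frac{1}{SE_j^{2}},
\qquad
\hat{\theta} \;=\; \frac{\sum_{j} w_j\,\hat{\theta}_j}{\sum_{j} w_j},
```

where the subgroup pooled effect and its standard error enter for each of the m subgroups (e.g., the teaching types or group-size bands); under the null hypothesis of equal subgroup effects, Q_between follows a chi-squared distribution with m − 1 degrees of freedom.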

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy has a considerable impact on cultivating learners' critical thinking as a whole and has a favorable promotional effect on both of its dimensions. Several studies have reported that collaborative problem-solving, the most frequently used strategy for teaching critical thinking in curriculum instruction, can considerably enhance students' critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004), and this meta-analysis provides convergent support for those views. Thus, the findings not only address the first research question regarding the overall effect of collaborative problem-solving on cultivating critical thinking and its two dimensions (i.e., attitudinal tendency and cognitive skills), but also strengthen our confidence in using collaborative problem-solving interventions in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, while the corresponding improvements in cognitive skill are only marginally better. Some studies suggest that cognitive skill differs from attitudinal tendency in classroom instruction: cultivating the former as a key ability is a process of gradual accumulation, while the latter, as an attitude, is affected by the context of the teaching situation (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting and interesting, as well as rewarding and challenging, because it takes learners as the focus and examines ill-structured problems in real situations; it can therefore inspire students to fully realize their potential for problem-solving, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency in turn affects cognitive skill when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that the two specific dimensions, as well as critical thinking as a whole, are affected by collaborative problem-solving, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies. To fully develop students' capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

In order to further explore the key factors that influence critical thinking, subgroup analysis was used to examine possible moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors, namely the teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area included in the 36 experimental designs, could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences for the learning stage and the measuring tool are not significant, so we cannot explain why these two factors are crucial in supporting the cultivation of critical thinking through collaborative problem-solving.

In terms of the learning stage, different learning stages influenced critical thinking positively but without significant intergroup differences, so we are unable to explain why the learning stage is crucial in fostering the growth of critical thinking.

Although higher education accounts for 70.89% of all the empirical studies included, high school may be the most appropriate learning stage for fostering students' critical thinking through collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students' cognitive development and needs to be studied further in follow-up research.

With regard to teaching type, mixed-course teaching may be the best method for cultivating students' critical thinking. Relevant studies have shown that if students are trained in thinking methods alone, the methods they learn remain isolated and divorced from subject knowledge, which hinders the transfer of those methods; conversely, if students' thinking is trained only within subject teaching, without systematic method training, it is challenging to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking in mixed courses, in parallel with other subject teaching, can achieve the best effect on learners' critical thinking, and explicit critical thinking instruction is more effective than less explicit instruction (Bensley and Spero, 2014).

In terms of intervention duration, the overall effect size shows an upward tendency with longer intervention times; thus, intervention duration and the impact on critical thinking are positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention. Instead, it is developed over a lengthy period of time through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Therefore, future empirical studies ought to take this into account and provide critical thinking instruction over longer periods.

With regard to group size, a group of 2–3 persons has the highest effect size, and the comprehensive effect size generally decreases as group size increases. This outcome is in line with other research findings; for example, a group of two to four members has been found most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis results also indicate that once group size exceeds 7 people, small groups no longer produce better interaction and performance than large groups. This may be because the learning scaffolds of technique support, resource support, and teacher support improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which is helpful for cultivating critical thinking through collaborative problem-solving.

With regard to the learning scaffold, all three kinds of learning scaffolds can enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with earlier findings: encouraging learners to collaborate, generate solutions, and develop critical thinking skills through learning scaffolds is a successful strategy (Reiser, 2004; Xu et al., 2022); learning scaffolds can lower task complexity and unpleasant feelings while enticing students to engage in learning activities (Wood et al., 2006); and learning scaffolds are designed to help students use learning approaches more successfully to adapt to the collaborative problem-solving process, with teacher-supported scaffolds having the greatest influence on critical thinking because they are more targeted, informative, and timely (Xu et al., 2022).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as trustworthy and effective by experts worldwide, only 54.43% of the studies included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized tools. According to Simpson and Courtney (2002, p. 91), “The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” As a result, in order to gauge more fully and precisely how learners' critical thinking has evolved, standardized measuring tools should be properly modified for collaborative problem-solving learning contexts.

With regard to the subject area, the comprehensive effect size of scientific subjects (e.g., mathematics, science, medical science) is larger than that of language arts and the social sciences. Recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can justify their judgments using accurate evidence and reasonable standards when they face challenges or poorly structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problem-solving in matters related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Other than those stated in the discussion above, the following suggestions are offered for critical thinking instruction utilizing the approach of collaborative problem-solving.

First, teachers should put special emphasis on the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students' critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class instruction are key ways to teach critical thinking, rather than simply reading speculative articles without practice (Mulnix, 2012). Furthermore, the improvement of students' critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008). Consequently, teachers should design real problems and encourage students to discuss, negotiate, and argue within collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners' critical thinking through collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating learners' critical thinking for flexible transfer and application in real problem-solving situations. This meta-analysis shows that mixed-course teaching has a highly substantial impact on cultivating and promoting learners' critical thinking. Therefore, teachers should combine real collaborative problem-solving situations with the knowledge content of specific disciplines in conventional teaching, teach critical thinking methods and strategies based on poorly structured problems to help students master critical thinking, and provide practical activities in which students interact with one another to develop knowledge construction and critical thinking through collaborative problem-solving.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should also be conscious of the ways in which teacher-supported learning scaffolds can promote critical thinking. The teacher-supported learning scaffold had the greatest impact on learners' critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be effectively taught when teachers recognize its significance for students' growth and use appropriate approaches when designing instructional activities (Forawi, 2016). Therefore, to enable teachers to create learning scaffolds that cultivate learners' critical thinking through collaborative problem-solving, it is essential to concentrate on teacher-supported learning scaffolds and to enhance instruction in teaching critical thinking for teachers, especially preservice teachers.

Implications and limitations

There are certain limitations in this meta-analysis that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the number of articles available for review. Second, some data provided by the included studies are missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and differences in critical thinking among learners of various ages and genders. Third, as is typical of review articles, additional studies were published while this meta-analysis was being conducted, so its coverage is limited in time. As relevant research develops, future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of the magnitude of collaborative problem-solving's impact on fostering students' critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem-solving is an effective teaching approach for fostering learners' critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving can significantly and effectively improve students' attitudinal tendency, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); its effect on students' cognitive skills is more modest, at an upper-middle level (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

As demonstrated by both the results and the discussion, all seven moderating factors identified across the 36 studies have varying degrees of beneficial effects on students' critical thinking. The teaching type (chi² = 7.20, P < 0.05), intervention duration (chi² = 12.18, P < 0.01), subject area (chi² = 13.36, P < 0.05), group size (chi² = 8.77, P < 0.05), and learning scaffold (chi² = 9.03, P < 0.01) all have a positive impact on critical thinking and can be viewed as important moderating factors that affect how critical thinking develops. Since the learning stage (chi² = 3.15, P = 0.21 > 0.05) and measuring tool (chi² = 0.08, P = 0.78 > 0.05) did not demonstrate significant intergroup differences, we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving.

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, and the supplementary information files are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6 .

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001


Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72


Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059


Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08


Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and Affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang


Corresponding authors

Correspondence to Enwei Xu or Wei Wang .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent


Supplementary information

Supplementary Tables

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10 , 16 (2023). https://doi.org/10.1057/s41599-023-01508-1


Received : 07 August 2022

Accepted : 04 January 2023

Published : 11 January 2023




Open Access

Peer-reviewed

Research Article

Fostering Critical Thinking, Reasoning, and Argumentation Skills through Bioethics Education

* E-mail: [email protected]

Affiliation Northwest Association for Biomedical Research, Seattle, Washington, United States of America

Affiliation Center for Research and Learning, Snohomish, Washington, United States of America

  • Jeanne Ting Chowning
  • Joan Carlton Griswold
  • Dina N. Kovarik
  • Laura J. Collins


  • Published: May 11, 2012
  • https://doi.org/10.1371/journal.pone.0036791


Developing a position on a socio-scientific issue and defending it using a well-reasoned justification involves complex cognitive skills that are challenging to both teach and assess. Our work centers on instructional strategies for fostering critical thinking skills in high school students using bioethical case studies, decision-making frameworks, and structured analysis tools to scaffold student argumentation. In this study, we examined the effects of our teacher professional development and curricular materials on the ability of high school students to analyze a bioethical case study and develop a strong position. We focused on student ability to identify an ethical question, consider stakeholders and their values, incorporate relevant scientific facts and content, address ethical principles, and consider the strengths and weaknesses of alternate solutions. 431 students and 12 teachers participated in a research study using teacher cohorts for comparison purposes. The first cohort received professional development and used the curriculum with their students; the second did not receive professional development until after their participation in the study and did not use the curriculum. In order to assess the acquisition of higher-order justification skills, students were asked to analyze a case study and develop a well-reasoned written position. We evaluated statements using a scoring rubric and found highly significant differences (p<0.001) between students exposed to the curriculum strategies and those who were not. Students also showed highly significant gains (p<0.001) in self-reported interest in science content, ability to analyze socio-scientific issues, awareness of ethical issues, ability to listen to and discuss viewpoints different from their own, and understanding of the relationship between science and society. Our results demonstrate that incorporating ethical dilemmas into the classroom is one strategy for increasing student motivation and engagement with science content, while promoting reasoning and justification skills that help prepare an informed citizenry.

Citation: Chowning JT, Griswold JC, Kovarik DN, Collins LJ (2012) Fostering Critical Thinking, Reasoning, and Argumentation Skills through Bioethics Education. PLoS ONE 7(5): e36791. https://doi.org/10.1371/journal.pone.0036791

Editor: Julio Francisco Turrens, University of South Alabama, United States of America

Received: February 7, 2012; Accepted: April 13, 2012; Published: May 11, 2012

Copyright: © 2012 Chowning et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: The “Collaborations to Understand Research and Ethics” (CURE) program was supported by a Science Education Partnership Award grant ( http://ncrrsepa.org ) from the National Center for Research Resources and the Division of Program Coordination, Planning, and Strategic Initiatives of the National Institutes of Health through Grant Number R25OD011138. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

Introduction

While the practice of argumentation is a cornerstone of the scientific process, students at the secondary level have few opportunities to engage in it [1] . Recent research suggests that collaborative discourse and critical dialogue focused on student claims and justifications can increase student reasoning abilities and conceptual understanding, and that strategies are needed to promote such practices in secondary science classrooms [2] . In particular, students need structured opportunities to develop arguments and discuss them with their peers. In scientific argument, the data, claims and warrants (that relate claims to data) are strictly concerned with scientific data; in a socio-scientific argument, students must consider stakeholder perspectives and ethical principles and ideas, in addition to relevant scientific background. Regardless of whether the arguments that students employ point towards scientific or socio-scientific issues, the overall processes students use in order to develop justifications rely on a model that conceptualizes arguments as claims to knowledge [3] .

Prior research in informal student reasoning and socio-scientific issues also indicates that most learners are not able to formulate high-quality arguments (as defined by the ability to articulate justifications for claims and to rebut contrary positions), and highlights the challenges related to promoting argumentation skills. Research suggests that students need experience and practice justifying their claims, recognizing and addressing counter-arguments, and learning about elements that contribute to a strong justification [4] , [5] .

Proponents of Socio-scientific Issues (SSI) education stress that the intellectual development of students in ethical reasoning is necessary to promote understanding of the relationship between science and society [4] , [6] . The SSI approach emphasizes three important principles: (a) because science literacy should be a goal for all students, science education should be broad-based and geared beyond imparting relevant content knowledge to future scientists; (b) science learning should involve students in thinking about the kinds of real-world experiences that they might encounter in their lives; and (c) when teaching about real-world issues, science teachers should aim to include contextual elements that are beyond traditional science content. Sadler and Zeidler, who advocate a SSI perspective, note that “people do not live their lives according to disciplinary boundaries, and students approach socio-scientific issues with diverse perspectives that integrate science and other considerations” [7] .

Standards for science literacy emphasize not only the importance of scientific content and processes, but also the need for students to learn about science that is contextualized in real-world situations that involve personal and community decision-making [7] – [10] . The National Board for Professional Teaching Standards stresses that students need “regular exposure to the human contexts of science [and] examples of ethical dilemmas, both current and past, that surround particular scientific activities, discoveries, and technologies” [11] . Teachers are mandated by national science standards and professional teaching standards to address the social dimensions of science, and are encouraged to provide students with the tools necessary to engage in analyzing bioethical issues; yet they rarely receive training in methods to foster such discussions with students.

The Northwest Association for Biomedical Research (NWABR), a non-profit organization that advances the understanding and support of biomedical research, has been engaging students and teachers in bringing the discussion of ethical issues in science into the classroom since 2000 [12] . The mission of NWABR is to promote an understanding of biomedical research and its ethical conduct through dialogue and education. The sixty research institutions that constitute our members include academia, industry, non-profit research organizations, research hospitals, professional societies, and volunteer health organizations. NWABR connects the scientific and education communities across the Northwestern United States and helps the public understand the vital role of research in promoting better health outcomes. We have focused on providing teachers with both resources to foster student reasoning skills (such as activities in which students practice evaluating arguments using criteria for strong justifications), as well as pedagogical strategies for fostering collaborative discussion [13] – [15] . Our work draws upon socio-scientific elements of functional scientific literacy identified by Zeidler et al. [6] . We include support for teachers in discourse issues, nature of science issues, case-based issues, and cultural issues – which all contribute to cognitive and moral development and promote functional scientific literacy. Our Collaborations to Understand Research and Ethics (CURE) program, funded by a Science Education Partnership Award from the National Institutes of Health (NIH), promotes understanding of translational biomedical research as well as the ethical considerations such research raises.

Many teachers find a principles-based approach most manageable for introducing ethical considerations. The principles include respect for persons (respecting the inherent worth of an individual and his or her autonomy), beneficence/nonmaleficence (maximizing benefits/minimizing harms), and justice (distributing benefits/burdens equitably across a group of individuals). These principles, which are articulated in the Belmont Report [16] in relation to research with human participants (and which are clarified and defended by Beauchamp and Childress [17] ), represent familiar concepts and are widely used. In our professional development workshops and in our support resources, we also introduce teachers to care, feminist, virtue, deontological and consequentialist ethics. Once teachers become familiar with principles, they often augment their teaching by incorporating these additional ethical approaches.

The Bioethics 101 materials that were the focus of our study were developed in conjunction with teachers, ethicists, and scientists. The curriculum contains a series of five classroom lessons and a culminating assessment [18] and is described in more detail in the Program Description below. For many years, teachers have shared with us the dramatic impacts that the teaching of bioethics can have on their students; this research study was designed to investigate the relationship between explicit instruction in bioethical reasoning and resulting student outcomes. In this study, teacher cohorts and student pre/post tests were used to investigate whether CURE professional development and the Bioethics 101 curriculum materials made a significant difference in high school students’ abilities to analyze a case study and justify their positions. Our research strongly indicates that such reasoning approaches can be taught to high school students and can significantly improve their ability to develop well-reasoned justifications to bioethical dilemmas. In addition, student self-reports provide additional evidence of the extent to which bioethics instruction impacted their attitudes and perceptions and increased student motivation and engagement with science content.

Program Description

Our professional development program, Ethics in the Science Classroom, spanned two weeks. The first week, a residential program at the University of Washington (UW) Pack Forest Conference Center, focused on our Bioethics 101 curriculum, which is summarized in Table S1 and is freely available at http://www.nwabr.org . The curriculum, a series of five classroom lessons and a culminating assessment, was implemented by all teachers who were part of our CURE treatment group. The lessons explore the following topics: (a) characteristics of an ethical question; (b) bioethical principles; (c) the relationship between science and ethics and the roles of objectivity/subjectivity and evidence in each; (d) analysis of a case study (including identifying an ethical question, determining relevant facts, identifying stakeholders and their concerns and values, and evaluating options); and (e) development of a well-reasoned justification for a position.

Additionally, the first week focused on effective teaching methods for incorporating ethical issues into science classrooms. We shared specific pedagogical strategies for helping teachers manage classroom discussion, such as asking students to consider the concerns and values of individuals involved in the case while in small single and mixed stakeholder groups. We also provided participants with background knowledge in biomedical research and ethics. Presentations from colleagues affiliated with the NIH Clinical and Translational Science Award program, from the Department of Bioethics and Humanities at the UW, and from NWABR member institutions helped participants develop a broad appreciation for the process of biomedical research and the ethical issues that arise as a consequence of that research. Topics included clinical trials, animal models of disease, regulation of research, and ethical foundations of research. Participants also developed materials directly relevant and applicable to their own classrooms, and shared them with other educators. Teachers wrote case studies and then used ethical frameworks to analyze the main arguments surrounding the case, thereby gaining experience in bioethical analysis. Teachers also developed Action Plans to outline their plans for implementation.

The second week provided teachers with first-hand experiences in NWABR research institutions. Teachers visited research centers such as the Tumor Vaccine Group and Clinical Research Center at the UW. They also had the opportunity to visit several of the following institutions: Amgen, Benaroya Research Institute, Fred Hutchinson Cancer Research Center, Infectious Disease Research Institute, Institute for Stem Cells and Regenerative Medicine at the UW, Pacific Northwest Diabetes Research Institute, Puget Sound Blood Center, HIV Vaccine Trials Network, and Washington National Primate Research Center. Teachers found these experiences in research facilities extremely valuable in helping make concrete the concepts and processes detailed in the first week of the program.

We held two follow-up sessions during the school year to deepen our relationship with the teachers, promote a vibrant ethics in science education community, provide additional resources and support, and reflect on challenges in implementation of our materials. We also provided the opportunity for teachers to share their experiences with one another and to report on the most meaningful longer-term impacts from the program. Another feature of our CURE program was the school-year Institutional Review Board (IRB) and Institutional Animal Care and Use Committee (IACUC) follow-up sessions. Teachers chose to attend one of NWABR’s IRB or IACUC conferences, attend a meeting of a review board, or complete NIH online ethics training. Some teachers also visited the UW Embryonic Stem Cell Research Oversight Committee. CURE funding provided substitutes in order for teachers to be released during the workday. These opportunities further engaged teachers in understanding and appreciating the actual process of oversight for federally funded research.

Participants

Most of the educators who have been through our intensive summer workshops teach secondary level science, but we have welcomed teachers at the college, community college, and even elementary levels. Our participants are primarily biology teachers; however, chemistry and physical science educators, health and career specialists, and social studies teachers have also used our strategies and materials with success.

The research design used teacher cohorts for comparison purposes and recruited teachers who expressed interest in participating in a CURE workshop in either the summer of 2009 or the summer of 2010. We assumed that all teachers who applied to the CURE workshop for either year would be similarly interested in ethics topics. Thus, Cohort 1 included teachers participating in CURE during the summer of 2009 (the treatment group). Their students received CURE instruction during the following 2009–2010 academic year. Cohort 2 (the comparison group) included teachers who were selected to participate in CURE during the summer of 2010. Their students received a semester of traditional classroom instruction in science during the 2009–2010 academic year. In order to track participation of different demographic groups, questions pertaining to race, ethnicity, and gender were also included in the post-tests.

Using an online sample size calculator ( http://www.surveysystem.com/sscalc.htm ), a 95% confidence level, and a confidence interval (margin of error) of ±5, we calculated that a sample size of 278 students would be needed for the research study. Accordingly, six Cohort 1 teachers were impartially selected for the study. For the comparison group, the study design also required six teachers from Cohort 2. The external evaluator contacted all Cohort 2 teachers to explain the research study and obtain their consent, and successfully recruited six to participate.
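For readers who want to see the arithmetic behind such calculators, the sketch below reproduces the standard sample-size formula with an optional finite-population correction. This is a minimal illustration, not the calculator's actual code; the population size used in the example is an assumption, since the study does not report the value entered into the calculator.

    import math

    def sample_size(z=1.96, margin=0.05, p=0.5, population=None):
        """Required sample size for estimating a proportion.

        z: z-score for the confidence level (1.96 for 95%)
        margin: confidence interval / margin of error (0.05 for +/-5)
        p: assumed proportion (0.5 is the most conservative choice)
        population: finite population size, if known
        """
        n0 = (z ** 2) * p * (1 - p) / margin ** 2  # about 384 for 95% and +/-5
        if population is None:
            return math.ceil(n0)
        # The finite-population correction reduces the required sample.
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    # Illustrative only: a pool of roughly 1,000 students yields a target
    # close to the 278 reported in the study.
    print(sample_size(population=1000))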

Ethics Statement

This study was conducted according to the principles expressed in the Declaration of Helsinki. Prior to the study, research processes and materials were reviewed and approved by the Western Institutional Review Board (WIRB Study #1103180). CURE staff and evaluators received written permission from parents for their minor children to participate in the Bioethics 101 curriculum, for the collection and subsequent analysis of students' written responses to the assessment, and for the collection and analysis of student interview responses. Teachers also provided written informed consent prior to study participation. All study participants and/or their legal guardians provided written informed consent for the collection and subsequent analysis of verbal and written responses.

Research Study

Analyzing a case study: CURE and comparison students.

Teacher cohorts and pre/post tests were used to investigate whether CURE professional development and curriculum materials made a significant difference in high school students' abilities to analyze a case study and justify their positions. Cohort 1 teachers (N = 6) received CURE professional development and used the Bioethics 101 curriculum with their students (N = 323); Cohort 2 teachers (N = 6) did not receive professional development until after their participation in the study and did not use the curriculum with their students (N = 108). Cohort 2 students completed the same test case study and questions but received only traditional science instruction during the semester. Each cohort was further divided into two groups (A and B). Students in Group A completed a pre-test prior to the case study, while students in Group B did not. All four student groups completed a post-test after analysis of the case study. This four-group model (Table 1) allowed us to assess: 1) the effect of CURE treatment relative to conventional education practices, 2) the effect of the pre-test relative to no pre-test, and 3) the interaction between the pre-test and the CURE treatment condition. Random assignment of students to treatment and comparison groups was not possible; consequently, we used existing intact classes. In all, 431 students and 12 teachers participated in the research study (Table 2).
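This layout is essentially a Solomon four-group design: treatment (CURE vs. comparison) crossed with pre-test (given vs. not given). As a rough sketch of how such data could be analyzed with open-source tools, the snippet below fits a treatment × pre-test model; the data frame, its column names (cohort, pretest, post_composite), and all values are hypothetical stand-ins, not the study's data.

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical per-student records: cohort 1 = CURE, cohort 2 = comparison;
    # pretest = 1 for Group A (took the pre-test), 0 for Group B.
    df = pd.DataFrame({
        "cohort":         [1, 1, 1, 1, 2, 2, 2, 2] * 10,
        "pretest":        [1, 1, 0, 0, 1, 1, 0, 0] * 10,
        "post_composite": [11, 12, 10, 11, 9, 8, 10, 9] * 10,
    })

    # Two-way ANOVA: main effect of treatment, main effect of pre-test,
    # and the treatment x pre-test interaction.
    model = smf.ols("post_composite ~ C(cohort) * C(pretest)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))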

Table 1. https://doi.org/10.1371/journal.pone.0036791.t001

Table 2. https://doi.org/10.1371/journal.pone.0036791.t002

To assess the acquisition of higher-order justification skills, we used the summative assessment provided in our curriculum as the pre- and post-test. We designed the curriculum to scaffold students' ability to write a persuasive bioethical position; by the time they participated in the assessment, Cohort 1 students had had opportunities to discuss the elements of a strong justification as well as practice analyzing case studies. For our research, both Cohort 1 and Cohort 2 students were asked to analyze the case study of "Ashley X" (Table S2), a young girl with a severe neurological impairment whose parents wished to limit her growth through a combination of interventions so that they could better care for her. Students were asked to respond to the ethical question: "Should one or more medical interventions be used to limit Ashley's growth and physical maturation? If so, which interventions should be used and why?" In their answer, students were encouraged to develop a well-reasoned written position by responding to five questions that reflected elements of a strong justification. One difficulty in evaluating a multifaceted science-related learning task (analyzing a bioethical case study and justifying a position) is that a traditional multiple-choice assessment may not adequately reflect the subtlety and depth of student understanding. We used a rubric to assess student responses to each of the following questions (Q) on a scale of 1 to 4; these questions represent key elements of a strong justification for a bioethical argument:

  • Q1: Student Position: What is your decision?
  • Q2: Factual Support: What facts support your decision? Is there missing information that could be used to make a better decision?
  • Q3: Interests and Views of Others: Who will be impacted by the decision and how will they be impacted?
  • Q4: Ethical Considerations: What are the main ethical considerations?
  • Q5: Evaluating Alternative Options: What are some strengths and weaknesses of alternate solutions?

In keeping with our focus on the process of reasoning rather than on having students draw any particular conclusion, we did not assess students on which position they took, but on how well they stated and justified the position they chose.

We used a rubric scoring guide, aligned with the complex cognitive challenges posed by the task, to assess student learning (Table S3). Assessing complex aspects of student learning is often difficult, especially evaluating how students represent their knowledge and competence in the domain of bioethical reasoning. Using a scoring rubric helped us more authentically score dimensions of students' learning and their depth of thinking. An outside scorer who had previously participated in CURE workshops, had secondary science teaching experience, and held a master's degree in bioethics blindly scored all student pre- and post-tests. Development of the rubric was an iterative process, with refinements made after analyzing a subset of surveys. Once the rubric was finalized, we confirmed the consistency and reliability of the rubric and grading process by re-scoring a subset of student surveys randomly selected from all participating classes. The Cronbach's alpha reliability coefficient was 0.80 [19].
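For reference, Cronbach's alpha for a set of rubric items can be computed directly from the student-by-item score matrix. The sketch below applies the standard formula to hypothetical 1-4 rubric scores; it is not the study's scoring data or software.

    import numpy as np

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for an (n_students x n_items) score matrix."""
        scores = np.asarray(item_scores, dtype=float)
        n_items = scores.shape[1]
        item_variances = scores.var(axis=0, ddof=1)       # variance of each item
        total_variance = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
        return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

    # Hypothetical rubric scores (Q1-Q5, scale 1-4) for six students.
    scores = np.array([
        [3, 2, 2, 1, 2],
        [4, 3, 3, 2, 2],
        [2, 2, 1, 1, 1],
        [3, 3, 2, 2, 2],
        [4, 4, 3, 3, 3],
        [2, 1, 2, 1, 1],
    ])
    print(round(cronbach_alpha(scores), 2))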

The rubric closely followed the framework introduced through the curricular materials and reinforced through other case study analyses. For example, under Q2, Factual Support , a student rated 4 out of 4 if their response demonstrated the following:

  • The justification uses the relevant scientific reasons to support student’s answer to the ethical question.
  • The student demonstrates a solid understanding of the context in which the case occurs, including a thoughtful description of important missing information.
  • The student shows logical, organized thinking. Both facts supporting the decision and missing information are presented at levels exceeding standard (as described above).

An example of a student response that received the highest rating for Q2 asking for factual support is: “Her family has a history of breast cancer and fibrocystic breast disease. She is bed-bound and completely dependent on her parents. Since she is bed-bound, she has a higher risk of blood clots. She has the mentality of an infant. Her parents’ requests offer minimal side effects. With this disease, how long is she expected to live? If not very long then her parents don’t have to worry about growth. Are there alternative measures?”

In contrast, a student rated a 1 for responses that had the following characteristics:

  • Factual information relevant to the case is incompletely described or is missing.
  • Irrelevant information may be included and the student demonstrates some confusion.

An example of a student response that rated a 1 for Q2 is: “She is unconscious and doesn’t care what happens.”

All data were entered into SPSS (Statistical Package for the Social Sciences) and analyzed for means, standard deviations, and statistically significant differences. An Analysis of Variance (ANOVA) was used to test for significant overall differences between the two cohort groups. Pre-test and post-test composite scores were calculated for each student by adding individual scores for each item on the pre- and post-tests. The composite score on the post-test was identical in form and scoring to the composite score on the pre-test. The effect of the CURE treatment on post-test composite scores is referred to as the Main Effect, and was determined by comparing the post-test composite scores of the Cohort 1 (CURE) and Cohort 2 (Comparison) groups. In addition, Cohort 1 and Cohort 2 mean scores for each test question (Questions 1–5) were compared within and between cohorts using t-tests.
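The same analyses could be reproduced in an open-source environment roughly as follows; the score arrays are hypothetical placeholders, since the per-student data are not reproduced here, and SciPy's functions stand in for the SPSS procedures.

    import numpy as np
    from scipy import stats

    # Hypothetical post-test composite scores for each cohort.
    cure_composite = np.array([11, 12, 10, 13, 9, 11])
    comparison_composite = np.array([9, 8, 10, 9, 8, 10])

    # One-way ANOVA for the main effect of treatment on composite scores.
    f_stat, p_main = stats.f_oneway(cure_composite, comparison_composite)
    print(f"Main effect: F = {f_stat:.2f}, p = {p_main:.4f}")

    # Independent-samples t-test on a single question (e.g., Q3) between cohorts.
    cure_q3 = np.array([2, 3, 2, 3, 2, 2])
    comparison_q3 = np.array([2, 1, 2, 2, 1, 2])
    t_stat, p_q3 = stats.ttest_ind(cure_q3, comparison_q3)
    print(f"Q3 between cohorts: t = {t_stat:.2f}, p = {p_q3:.4f}")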

CURE student perceptions of curriculum effect.

During prior program evaluations, we asked teachers to identify what they believed to be the main impacts of bioethics instruction on students. From this earlier work, we identified several themes. These themes, listed below, were further tested in our current study by asking students in the treatment group to assess themselves in these five areas after participation in the lesson, using a retrospective pre-test design to measure self-reported changes in perceptions and abilities [20] .

  • Interest in the science content of class (before/after) participating in the Ethics unit.
  • Ability to analyze issues related to science and society and make well-justified decisions (before/after) participating in the Ethics unit.
  • Awareness of ethics and ethical issues (before/after) participating in the Ethics unit.
  • Understanding of the connection between science and society (before/after) participating in the Ethics unit.
  • Ability to listen to and discuss different viewpoints (before/after) participating in the Ethics unit.

After Cohort 1 (CURE) students participated in the Bioethics 101 curriculum, we asked them to indicate the extent to which they had changed in each of the theme areas we had identified using Likert-scale items on a retrospective pre-test design [21], with 1 = None and 5 = A lot! We used paired t-tests to examine self-reported changes in their perceptions and abilities. The retrospective design avoids response-shift bias that results from overestimation or underestimation of change since both before and after information is collected at the same time [20].
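A paired t-test on one retrospective item might be sketched as below; the before/after rating vectors are invented for illustration and do not come from the study.

    import numpy as np
    from scipy import stats

    # Hypothetical retrospective ratings (1 = None ... 5 = A lot!) for one theme,
    # with each student giving a "before" and an "after" rating on the same
    # post-unit survey.
    before = np.array([3, 2, 3, 2, 4, 3, 2, 3])
    after = np.array([4, 4, 5, 3, 5, 4, 4, 4])

    t_stat, p_value = stats.ttest_rel(before, after)
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")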

Student Demographics

Demographic information is provided in Table 3 . Of those students who reported their gender, a larger number were female (N = 258) than male (N = 169), 60% and 40%, respectively, though female students represented a larger proportion of Cohort 1 than Cohort 2. Students ranged in age from 14 to 18 years old; the average age of the students in both cohorts was 15. Students were enrolled in a variety of science classes (mostly Biology or Honors Biology). Because NIH recognizes a difference between race and ethnicity, students were asked to respond to both demographic questions. Students in both cohorts were from a variety of ethnic and racial backgrounds.

Table 3. https://doi.org/10.1371/journal.pone.0036791.t003

Pre- and Post-Test Results for CURE and Comparison Students

Post-test composite means for each cohort (1 and 2) and group (A and B) are shown in Table 4. Students receiving CURE instruction earned significantly higher (p<0.001) composite mean scores than students in comparison classrooms. Cohort 1 (CURE) students (N = 323) had a post-test composite mean of 10.73, while Cohort 2 (Comparison) students (N = 108) had a post-test composite mean of 9.16. The ANOVA results (Table 5) showed significant differences in the ability to craft strong justifications between Cohort 1 (CURE) and Cohort 2 (Comparison) students, F(1, 429) = 26.64, p<0.001.

Table 4. https://doi.org/10.1371/journal.pone.0036791.t004

Table 5. https://doi.org/10.1371/journal.pone.0036791.t005

We also examined whether the pre-test had a priming effect on students' scores, since it provides an opportunity to practice or think about the content. The pre-test would not have this effect on the comparison group because they were not exposed to CURE teaching or materials. If the pre-test provided a practice or priming effect, CURE students who received the pre-test would perform better on the post-test than CURE students who did not. For this comparison, F(1, 321) = 0.10, p = 0.92. This result suggests that the differences between the CURE and comparison groups are attributable to the treatment condition and not to a priming effect of the pre-test.
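In practice, this priming check amounts to comparing post-test composites of CURE students who took the pre-test (Group A) with those who did not (Group B). A minimal sketch with invented score vectors:

    import numpy as np
    from scipy import stats

    # Hypothetical post-test composites within Cohort 1 (CURE) only.
    cure_with_pretest = np.array([11, 12, 10, 11, 13, 10])     # Group A
    cure_without_pretest = np.array([11, 10, 12, 11, 12, 10])  # Group B

    # With two groups, a one-way ANOVA is equivalent to an independent
    # t-test (F equals t squared).
    f_stat, p_value = stats.f_oneway(cure_with_pretest, cure_without_pretest)
    print(f"Pre-test priming check: F = {f_stat:.2f}, p = {p_value:.3f}")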

After investigating differences in main effects, we analyzed differences between and within cohorts on individual items (Questions 1–5) using t-tests. The mean scores of individual questions for each cohort are shown in Figure 1. There were no significant differences between Cohort 1 (CURE) and Cohort 2 (Comparison) on pre-test scores. In fact, for Q5, the mean pre-test score for the Cohort 2 (Comparison) group was slightly higher (1.8) than that of the Cohort 1 (CURE) group (1.6). On the post-test, the Cohort 1 (CURE) students significantly outscored the Cohort 2 (Comparison) students on all questions; Q1, Q3, and Q4 were significant at p<0.001, Q2 was significant at p<0.01, and Q5 was significant at p<0.05. The largest post-test difference between Cohort 1 (CURE) students and Cohort 2 (Comparison) students was for Q3, with a difference of 0.6; all the other questions showed differences of 0.3 or less. Comparing Cohort 1 (CURE) post-test performance on individual questions yields the following results: scores were highest for Q1 (mean = 2.8), followed by Q3 (mean = 2.2), Q2 (mean = 2.1), and Q5 (mean = 1.9). The lowest Cohort 1 (CURE) post-test scores were associated with Q4 (mean = 1.8).

Figure 1. Mean scores for individual items of the pre-test for each cohort revealed no differences between groups for any of the items (Cohort 1, CURE, N = 323; Cohort 2, Comparison, N = 108). Post-test gains of Cohort 1 (CURE) relative to Cohort 2 (Comparison) were statistically significant for all questions. (Question (Q) 1) What is your decision? (Q2) What facts support your decision? Is there missing information that could be used to make a better decision? (Q3) Who will be impacted by the decision and how will they be impacted? (Q4) What are the main ethical considerations? and (Q5) What are some strengths and weaknesses of alternate solutions? Specifically: (Q1), (Q3), and (Q4) were significant at p<0.001 (***); (Q2) was significant at p<0.01 (**); and (Q5) was significant at p<0.05 (*). Lines represent standard deviations.

https://doi.org/10.1371/journal.pone.0036791.g001

Overall, across all four groups, mean scores for Q1 were highest (2.6), while scores for Q4 were lowest (1.6). When comparing within-Cohort scores on the pre-test versus post-test, Cohort 2 (Comparison Group) showed little to no change, while CURE students improved on all test questions.

CURE Student Perceptions of Curriculum Effect

After using our resources, Cohort 1 (CURE) students showed highly significant gains (p<0.001) in all areas examined: interest in science content, ability to analyze socio-scientific issues and make well-justified decisions, awareness of ethical issues, understanding of the connection between science and society, and the ability to listen to and discuss viewpoints different from their own (Figure 2). Overall, students gave the highest score to their ability to listen to and discuss viewpoints different from their own after participating in the CURE unit (mean = 4.2). Also highly rated were the changes in understanding of the connection between science and society (mean = 4.1) and the awareness of ethical issues (mean = 4.1); these two perceptions also showed the largest pre-post change (from 2.8 to 4.1 and 2.7 to 4.1, respectively).

Figure 2. Mean scores for the retrospective items on the post-test for Cohort 1 students revealed significant gains (p<0.001) in all self-reported items: Interest in science (N = 308), ability to Analyze issues related to science and society and make well-justified decisions (N = 306), Awareness of ethics and ethical issues (N = 309), Understanding of the connection between science and society (N = 308), and the ability to Listen to and discuss different viewpoints (N = 308). Lines represent standard deviations.

https://doi.org/10.1371/journal.pone.0036791.g002

Discussion

NWABR’s teaching materials provide support for general ethics and bioethics education, as well as for specific topics such as embryonic stem cell research. These resources were developed to provide teachers with classroom strategies, ethics background, and decision-making frameworks. Teachers are then prepared to share their understanding with their students, and to support their students in using analysis tools and participating in effective classroom discussions. Our current research grew out of a desire to measure the effectiveness of our professional development and teaching resources in fostering students’ ability to analyze a complex bioethical case study and to justify their positions.

Consistent with the findings of SSI researchers and our own prior anecdotal observations of teacher classrooms and student work, we found that students improve in their analytical skill when provided with reasoning frameworks and background in concepts such as beneficence, respect, and justice. Our research demonstrates that structured reasoning approaches can be effectively taught at the secondary level and that they can improve student thinking skills. After teachers participated in a two-week professional development workshop and utilized our Bioethics 101 curriculum, within a relatively short time period (five lessons spanning approximately one to two weeks), students grew significantly in their ability to analyze a complex case and justify their position compared to students not exposed to the program. Often, biology texts present a controversial issue and ask students to “justify their position,” but teachers have shared with us that students frequently do not understand what makes a position or argument well-justified. By providing students with opportunities to evaluate sample justifications, and by explicitly introducing a set of elements that students should include in their justifications, we have facilitated the development of this important cognitive skill.

The first part of our research examined the impact of CURE instruction on students’ ability to analyze a case study. Although students grew significantly in all areas, the highest scores for the Cohort 1 (CURE) students were found in response to Q1 of the case analysis, which asked them to clearly state their own position, and represented a relatively easy cognitive task. This question also received the highest score in the comparison group. Not surprisingly, students struggled most with Q4 and Q5, which asked for the ethical considerations and the strengths and weaknesses of different solutions, respectively, and which tested specialized knowledge and sophisticated analytical skills. The area in which we saw the most growth in Cohort 1 (CURE) (both in comparison to the pre-test and in relation to the comparison group) was in students’ ability to identify stakeholders in a case and state how they might be impacted by a decision (Q3). Teachers have shared with us that secondary students are often focused on their own needs and perspectives; stepping into the perspectives of others helps enlarge their understanding of the many views that can be brought to bear upon a socio-scientific issue.

Many of our teachers go far beyond these introductory lessons, revisiting key concepts throughout the year as new topics are presented in the media or as new curricular connections arise. Although we have observed this phenomenon for many years, it has been difficult to evaluate these types of interventions, as so many teachers implement the concepts and ideas differently in response to their unique needs. Some teachers have used the Bioethics 101 curriculum as a means for setting the tone and norms for the entire year in their classes and fostering an atmosphere of respectful discussion. These teachers note that the “opportunity cost” of investing time in teaching basic bioethical concepts, decision-making strategies, and justification frameworks pays off over the long run. Students’ understanding of many different science topics is enhanced by their ability to analyze issues related to science and society and make well-justified decisions. Throughout their courses, teachers are able to refer back to the core ideas introduced in Bioethics 101, reinforcing the wide utility of the curriculum.

The second part of our research focused on changes in students’ self-reported attitudes and perceptions as a result of CURE instruction. Obtaining accurate and meaningful data to assess student self-reported perceptions can be difficult, especially when a program is distributed across multiple schools. The traditional use of the pretest-posttest design assumes that students are using the same internal standard to judge attitudes or perceptions. Considerable empirical evidence suggests that program effects based on pre-posttest self-reports are masked because people either overestimate or underestimate their pre-program perceptions [20] , [22] – [26] . Moore and Tananis [27] report that response shift can occur in educational programs, especially when they are designed to increase students’ awareness of a specific construct that is being measured. The retrospective pre-test design (RPT), which was used in this study, has gained increasing prominence as a convenient and valid method for measuring self-reported change. RPT has been shown to reduce response shift bias, providing more accurate assessment of actual effect. The retrospective design avoids response-shift bias that results from overestimation or underestimation of change since both before and after information is collected at the same time [20] . It is also convenient to implement, provides comparison data, and may be more appropriate in some situations [26] . Using student self-reported measures concerning perceptions and attitudes is also a meta-cognitive strategy that allows students to think about their learning and justify where they believe they are at the end of a project or curriculum compared to where they were at the beginning.

Our approach resulted in a significant increase in students’ own perceived growth in several areas related to awareness, understanding, and interest in science. Our finding that student interest in science can be significantly increased through a case-study based bioethics curriculum has implications for instruction. Incorporating ethical dilemmas into the classroom is one strategy for increasing student motivation and engagement with science content. Students noted the greatest changes in their own awareness of ethical issues and in understanding the connection between science and society. Students gave the highest overall rating to their ability to listen to and discuss viewpoints different from their own after participation in the bioethics unit. This finding also has implications for our future citizenry; in an increasingly diverse and globalized society, students need to be able to engage in civil and rational dialogue with others who may not share their views.

Conducting research studies about ethical learning in secondary schools is challenging; recruiting teachers for Cohort 2 and obtaining consent from students, parents, and teachers for participation was particularly difficult, and many teachers faced constraints imposed by district regulations on curriculum content. Additional studies are needed to clarify the extent to which our curricular materials alone, without accompanying teacher professional development, can improve student reasoning skills.

Teacher pre-service training programs rarely incorporate discussion of how to address ethical issues in science with prospective educators. Likewise, with some notable exceptions, such as the work of the University of Pennsylvania High School Bioethics Project, the Genetic Science Learning Center at the University of Utah, and the Kennedy Institute of Ethics at Georgetown University, relatively few high school curricular resources exist in this area. Teachers have shared with us that they know such issues are important and engaging for students, but they do not have the experience in either ethical theory or in managing classroom discussion to feel comfortable teaching bioethics topics. After participating in our workshops or using our teaching materials, teachers shared that they are better prepared to address such issues with their students, and that students are more engaged in science topics and are better able to see the real-world context of what they are learning.

Preparing students for a future in which they have access to personalized genetic information, or need to vote on proposals for stem cell research funding, necessitates providing them with the tools required to reason through a complex decision containing both scientific and ethical components. Students begin to realize that, although there may not be an absolute “right” or “wrong” decision to be made on an ethical issue, neither is ethics purely relative (“my opinion versus yours”). They come to realize that all arguments are not equal; there are stronger and weaker justifications for positions. Strong justifications are built upon accurate scientific information and solid analysis of ethical and contextual considerations. An informed citizenry that can engage in reasoned dialogue about the role science should play in society is critical to ensure the continued vitality of the scientific enterprise.

“I now bring up ethical issues regularly with my students, and use them to help students see how the concepts they are learning apply to their lives…I am seeing positive results from my students, who are more clearly able to see how abstract science concepts apply to them.” – CURE Teacher

“In ethics, I’ve learned to start thinking about the bigger picture. Before, I based my decisions on how they would affect me. Also, I made decisions depending on my personal opinions, sometimes ignoring the facts and just going with what I thought was best. Now, I know that to make an important choice, you have to consider the other people involved, not just yourself, and take all information and facts into account.” – CURE Student

Supporting Information

Table S1. Bioethics 101 Lesson Overview. https://doi.org/10.1371/journal.pone.0036791.s001

Table S2. Case Study for Assessment. https://doi.org/10.1371/journal.pone.0036791.s002

Table S3. Grading Rubric for Pre- and Post-Test: Ashley’s Case. https://doi.org/10.1371/journal.pone.0036791.s003

Acknowledgments

We thank Susan Adler, Jennifer M. Pang, Ph.D., Leena Pranikay, and Reitha Weeks, Ph.D., for their review of the manuscript, and Nichole Beddes for her assistance scoring student work. We also thank Carolyn Cohen of Cohen Research and Evaluation, former CURE Evaluation Consultant, who laid some of the groundwork for this study through her prior work with us. We also wish to thank the reviewers of our manuscript for their thoughtful feedback and suggestions.

Author Contributions

Conceived and designed the experiments: JTC LJC. Performed the experiments: LJC. Analyzed the data: LJC JTC DNK. Contributed reagents/materials/analysis tools: JCG. Wrote the paper: JTC LJC DNK JCG. Served as Principal Investigator on the CURE project: JTC. Provided overall program leadership: JTC. Led the curriculum and professional development efforts: JTC JCG. Raised funds for the CURE program: JTC.

  • 1. Bell P (2004) Promoting students’ argument construction and collaborative debate in the science classroom. Mahwah, NJ: Erlbaum.
  • 3. Toulmin S (1958) The Uses of Argument. Cambridge: Cambridge University Press.
  • 6. Zeidler DL, Sadler TD, Simmons ML, Howes EV (2005) Beyond STS: A research-based framework for socioscientific issues education. Wiley InterScience. pp. 357–377.
  • 8. AAAS (1990) Science for All Americans. New York: Oxford University Press.
  • 9. National Research Council (1996) National Science Education Standards. Washington, DC: National Academies Press.
  • 10. National Research Council (2011) A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. Washington, DC: National Academies Press.
  • 11. National Board for Professional Teaching Standards (2007) Adolescence and Young Adulthood Science Standards. Arlington, VA.
  • 17. Beauchamp T, Childress JF (2001) Principles of biomedical ethics. New York: Oxford University Press.
  • 18. Chowning JT, Griswold JC (2010) Bioethics 101. Seattle, WA: NWABR.
  • 26. Klatt J, Taylor-Powell E (2005) Synthesis of literature relative to the retrospective pretest design. Presentation to the 2005 Joint CES/AEA Conference, Toronto.

Skills for Life: Fostering Critical Thinking

From NAGT's On the Cutting Edge Collection (Teach the Earth, the portal for Earth Education)



Critical Thinking and Inquiry

Education research has demonstrated what great educators have always known: students acquire and retain knowledge most effectively when they must understand new information well enough to apply it to new situations, or to reformulate it into new ideas and knowledge (NRC, 2000). Fostering critical thinking skills is becoming one of the chief goals of education, particularly at the college level, where a variety of pedagogic techniques are being used to develop critical thinking skills in students (Irenton, Manduca and Mogk, 1996).

These skills are often developed in geoscience students through project-based learning, in laboratory or field settings. In single laboratories or longer-term projects, students are asked to acquire basic knowledge and understanding, review literature, develop research skills, gather, analyze, and evaluate data, and ultimately to synthesize this complex information into an advanced understanding.

It is important that web resources developed for educational use reinforce this kind of learning. Many resources simply display information on pages where the only student interaction is a click on the "next" or "previous" buttons. Even if these pages are nested in cutting-edge technology, such experiences are analogous to lectures, which have been shown to fail in teaching advanced thinking skills (Schank et al., 1995). Schank et al. posit four fundamental features of natural human learning: learning is goal-directed, learning is failure-driven, learning is case-based, and learning best occurs by doing. The authors also give examples of what it means to use these fundamentals in designing multimedia or web resources.

Critical thinking is a complex idea. It can mean many different things to different people depending on their point of view. So what do we mean by "critical thinking" and "inquiry"? The next several pages include opportunities for you to explore your own definitions, see what research on the subject has to say, and examine an example course that embodies some of the principles we are talking about.


