Accelerate Learning


Critical Thinking in Science: Fostering Scientific Reasoning Skills in Students

ALI Staff | Published July 13, 2023 | Updated December 14, 2023

Thinking like a scientist is a central goal of all science curricula.

As students learn facts, methods, and procedures, what matters most is that all of their learning happens through the lens of scientific reasoning.

That way, when it comes time for them to take on a little science themselves, either in the lab or by theoretically thinking through a solution, they understand how to do it in the right context.

One component of this type of thinking is being critical. Because it is grounded in facts and evidence, critical thinking in science isn’t exactly the same as critical thinking in other subjects.

Students have to doubt the information they’re given until they can prove it’s right.

They have to truly understand what’s true and what’s hearsay. It’s complex, but with the right tools and plenty of practice, students can get it right.

What is critical thinking?

This particular style of thinking stands out because it requires reflection and analysis. Grounded in what's logical and rational, thinking critically is all about digging deep and going beyond the surface of a question to establish the quality of the question itself.

It ensures students put their brains to work when confronted with a question rather than taking every piece of information they’re given at face value.

It’s engaged, higher-level thinking that will serve them well in school and throughout their lives.

Why is critical thinking important?

Critical thinking is important when it comes to making good decisions.

It gives us the tools to think through a choice rather than quickly picking an option — and probably guessing wrong. Think of it as the all-important ‘why.’

Why is that true? Why is that right? Why is this the only option?

Finding answers to questions like these requires critical thinking. You have to really analyze both the question itself and the possible solutions to establish their validity.

Will that choice work for me? Does this feel right based on the evidence?

How does critical thinking in science impact students?

Critical thinking is essential in science.

It’s what naturally takes students in the direction of scientific reasoning since evidence is a key component of this style of thought.

It’s not just about whether evidence is available to support a particular answer but how valid that evidence is.

It’s about whether the information the student has fits together to create a strong argument and how to use verifiable facts to get a proper response.

Critical thinking in science helps students:

  • Actively evaluate information
  • Identify bias
  • Separate the logic within arguments
  • Analyze evidence

4 Ways to promote critical thinking

Figuring out how to develop critical thinking skills in science means looking at multiple strategies and deciding what will work best at your school and in your class.

Depending on your student population and their needs and abilities, not every option will be a home run.

These particular examples are all based on the idea that for students to really learn how to think critically, they have to practice doing it. 

Each focuses on engaging students with science in a way that will motivate them to work independently as they hone their scientific reasoning skills.

Project-Based Learning

Project-based learning centers on critical thinking.

Teachers can shape a project around the thinking style to give students practice with evaluating evidence or other critical thinking skills.

Critical thinking also happens during collaboration, evidence-based thought, and reflection.

For example, setting students up for a research project is not only a great way to get them to think critically, but it also helps motivate them to learn.

Allowing them to pick a topic (one that isn’t easy to look up online), develop their own research questions, and establish a process to collect data to find an answer lets students personally connect to science while using critical thinking at each stage of the assignment.

They’ll have to evaluate the quality of the research they find and make evidence-based decisions.

Self-Reflection

Adding a question or two to any lab practicum or activity requiring students to pause and reflect on what they did or learned also helps them practice critical thinking.

At this point in an assignment, they’ll pause and assess independently. 

You can ask students to reflect on the conclusions they came up with for a completed activity, which really makes them think about whether there's any bias in their answer.

Addressing Assumptions

One way critical thinking aligns so perfectly with scientific reasoning is that it encourages students to challenge all assumptions. 

Evidence is king in the science classroom, but even when students work with hard facts, there comes the risk of a little assumptive thinking.

Working with students to identify assumptions in existing research, or asking them to address an issue where they suspend their own judgment and simply look at established facts, polishes that critical eye.

They’re getting practice at tossing out opinions, unproven hypotheses, and speculation in exchange for real data and real results, just like a scientist has to do.

Lab Activities With Trial-And-Error

Another component of critical thinking (as well as thinking like a scientist) is figuring out what to do when you get something wrong.

Backtracking can mean you have to rethink a process, redesign an experiment, or reevaluate data because the outcomes don’t make sense, and that’s okay.

The ability to get something wrong and recover is not only a valuable life skill, but it’s where most scientific breakthroughs start. Reminding students of this is always a valuable lesson.

Labs that include comparative activities are one way to increase critical thinking skills, especially when introducing new evidence that might cause students to change their conclusions once the lab has begun.

For example, you might provide students with two distinct data sets and ask them to compare them.

With only two choices, there is a finite number of conclusions to draw, but what happens when you bring in a third data set? Will it void certain conclusions? Will it let students draw new conclusions, ones even more deeply rooted in evidence?

Thinking like a scientist

When students get the opportunity to think critically, they’re learning to trust the data over their ‘gut,’ to approach problems systematically and make informed decisions using ‘good’ evidence.

When practiced enough, this ability will engage students in science in a whole new way, providing them with opportunities to dig deeper and learn more.

It can help enrich science and motivate students to approach the subject just like a professional would.


© 2024 Accelerate Learning  

41+ Critical Thinking Examples (Definition + Practices)


Critical thinking is an essential skill in our information-overloaded world, where figuring out what is fact and fiction has become increasingly challenging.

But why is critical thinking essential? Put simply, critical thinking empowers us to make better decisions, challenge and validate our beliefs and assumptions, and understand and interact with the world more effectively and meaningfully.

Critical thinking is like using your brain's "superpowers" to make smart choices. Whether it's picking the right insurance, deciding what to do in a job, or discussing topics in school, thinking deeply helps a lot. In the next parts, we'll share real-life examples of when this superpower comes in handy and give you some fun exercises to practice it.

Critical Thinking Process Outline


Critical thinking means thinking clearly and fairly without letting personal feelings get in the way. It's like being a detective, trying to solve a mystery by using clues and thinking hard about them.

It isn't always easy to think critically, as it can take a pretty smart person to see some of the questions that aren't being answered in a certain situation. But, we can train our brains to think more like puzzle solvers, which can help develop our critical thinking skills.

Here's what it looks like step by step:

Spotting the Problem: It's like discovering a puzzle to solve. You see that there's something you need to figure out or decide.

Collecting Clues: Now, you need to gather information. Maybe you read about it, watch a video, talk to people, or do some research. It's like getting all the pieces to solve your puzzle.

Breaking It Down: This is where you look at all your clues and try to see how they fit together. You're asking questions like: Why did this happen? What could happen next?

Checking Your Clues: You want to make sure your information is good. This means seeing if what you found out is true and if you can trust where it came from.

Making a Guess: After looking at all your clues, you think about what they mean and come up with an answer. This answer is like your best guess based on what you know.

Explaining Your Thoughts: Now, you tell others how you solved the puzzle. You explain how you thought about it and how you answered. 

Checking Your Work: This is like looking back and seeing if you missed anything. Did you make any mistakes? Did you let any personal feelings get in the way? This step helps make sure your thinking is clear and fair.

And remember, you might sometimes need to go back and redo some steps if you discover something new. If you realize you missed an important clue, you might have to go back and collect more information.

Critical Thinking Methods

Just like doing push-ups or running helps our bodies get stronger, there are special exercises that help our brains think better. These brain workouts push us to think harder, look at things closely, and ask many questions.

It's not always about finding the "right" answer. Instead, it's about the journey of thinking and asking "why" or "how." Doing these exercises often helps us become better thinkers and makes us curious to know more about the world.

Now, let's look at some brain workouts to help us think better:

1. "What If" Scenarios

Imagine crazy things happening, like, "What if there was no internet for a month? What would we do?" These games help us think of new and different ideas.

2. Debate Both Sides

Pick a hot topic. Argue one side of it and then try arguing the opposite. This makes us see different viewpoints and think deeply about a topic.

3. Analyze Visual Data

Check out charts or pictures with lots of numbers and info but no explanations. What story are they telling? This helps us get better at understanding information just by looking at it.

4. Mind Mapping

Write an idea in the center and then draw lines to related ideas. It's like making a map of your thoughts. This helps us see how everything is connected.

There's lots of mind-mapping software, but it's also nice to do this by hand.

5. Weekly Diary

Every week, write about what happened, the choices you made, and what you learned. Writing helps us think about our actions and how we can do better.

6. Evaluating Information Sources

Collect stories or articles about one topic from newspapers or blogs. Which ones are trustworthy? Which ones might be a little biased? This teaches us to be smart about where we get our info.

There are many resources to help you determine if information sources are factual or not.

7. Socratic Questioning

This way of thinking is called the Socratic Method, named after Socrates, an ancient Greek philosopher. It's about asking lots of questions to understand a topic. You can do this by yourself or chat with a friend.

Start with a Big Question:

"What does 'success' mean?"

Dive Deeper with More Questions:

"Why do you think of success that way?" "Do TV shows, friends, or family make you think that?" "Does everyone think about success the same way?"

"Can someone be a winner even if they aren't rich or famous?" "Can someone feel like they didn't succeed, even if everyone else thinks they did?"

Look for Real-life Examples:

"Who is someone you think is successful? Why?" "Was there a time you felt like a winner? What happened?"

Think About Other People's Views:

"How might a person from another country think about success?" "Does the idea of success change as we grow up or as our life changes?"

Think About What It Means:

"How does your idea of success shape what you want in life?" "Are there problems with only wanting to be rich or famous?"

Look Back and Think:

"After talking about this, did your idea of success change? How?" "Did you learn something new about what success means?"


8. Six Thinking Hats 

Edward de Bono came up with a cool way to solve problems by thinking in six different ways, like wearing different colored hats. You can do this independently, but it might be more effective in a group so everyone can have a different hat color. Each color has its way of thinking:

White Hat (Facts): Just the facts! Ask, "What do we know? What do we need to find out?"

Red Hat (Feelings): Talk about feelings. Ask, "How do I feel about this?"

Black Hat (Careful Thinking): Be cautious. Ask, "What could go wrong?"

Yellow Hat (Positive Thinking): Look on the bright side. Ask, "What's good about this?"

Green Hat (Creative Thinking): Think of new ideas. Ask, "What's another way to look at this?"

Blue Hat (Planning): Organize the talk. Ask, "What should we do next?"

When using this method with a group:

  • Explain all the hats.
  • Decide which hat to wear first.
  • Make sure everyone switches hats at the same time.
  • Finish with the Blue Hat to plan the next steps.

9. SWOT Analysis

SWOT Analysis is like a game plan for businesses to know where they stand and where they should go. "SWOT" stands for Strengths, Weaknesses, Opportunities, and Threats.

There are a lot of SWOT templates out there for how to do this visually, but you can also think it through. It doesn't just apply to businesses but can be a good way to decide if a project you're working on is working.

Strengths: What's working well? Ask, "What are we good at?"

Weaknesses: Where can we do better? Ask, "Where can we improve?"

Opportunities: What good things might come our way? Ask, "What chances can we grab?"

Threats: What challenges might we face? Ask, "What might make things tough for us?"

Steps to do a SWOT Analysis:

  • Goal: Decide what you want to find out.
  • Research: Learn about your business and the world around it.
  • Brainstorm: Get a group and think together. Talk about strengths, weaknesses, opportunities, and threats.
  • Pick the Most Important Points: Some things might be more urgent or important than others.
  • Make a Plan: Decide what to do based on your SWOT list.
  • Check Again Later: Things change, so look at your SWOT again after a while to update it.

Now that you have a few tools for thinking critically, let’s get into some specific examples.

Everyday Examples

Life is a series of decisions. From the moment we wake up, we're faced with choices – some trivial, like choosing a breakfast cereal, and some more significant, like buying a home or confronting an ethical dilemma at work. While it might seem that these decisions are disparate, they all benefit from the application of critical thinking.

10. Deciding to buy something

Imagine you want a new phone. Don't just buy it because the ad looks cool. Think about what you need in a phone. Look up different phones and see what people say about them. Choose the one that's the best deal for what you want.

11. Deciding what is true

There's a lot of news everywhere. Don't believe everything right away. Think about why someone might be telling you this. Check if what you're reading or watching is true. Make up your mind after you've looked into it.

12. Deciding when you’re wrong

Sometimes, friends can have disagreements. Don't just get mad right away. Try to see where they're coming from. Talk about what's going on. Find a way to fix the problem that's fair for everyone.

13. Deciding what to eat

There's always a new diet or exercise that's popular. Don't just follow it because it's trendy. Find out if it's good for you. Ask someone who knows, like a doctor. Make choices that make you feel good and stay healthy.

14. Deciding what to do today

Everyone is busy with school, chores, and hobbies. Make a list of things you need to do. Decide which ones are most important. Plan your day so you can get things done and still have fun.

15. Making Tough Choices

Sometimes, it's hard to know what's right. Think about how each choice will affect you and others. Talk to people you trust about it. Choose what feels right in your heart and is fair to others.

16. Planning for the Future

Big decisions, like where to go to school, can be tricky. Think about what you want in the future. Look at the good and bad of each choice. Talk to people who know about it. Pick what feels best for your dreams and goals.


Job Examples

17. Solving Problems

When a machine breaks at a factory, workers brainstorm ways to fix it quickly without making things worse.

18. Decision Making

A store manager decides which products to order more of based on what's selling best.

19. Setting Goals

A team leader helps their team decide what tasks are most important to finish this month and which can wait.

20. Evaluating Ideas

At a team meeting, everyone shares ideas for a new project. The group discusses each idea's pros and cons before picking one.

21. Handling Conflict

Two workers disagree on how to do a job. Instead of arguing, they talk calmly, listen to each other, and find a solution they both like.

22. Improving Processes

A cashier thinks of a faster way to ring up items so customers don't have to wait as long.

23. Asking Questions

Before starting a big task, an employee asks for clear instructions and checks if they have the necessary tools.

24. Checking Facts

Before presenting a report, someone double-checks all their information to make sure there are no mistakes.

25. Planning for the Future

A business owner thinks about what might happen in the next few years, like new competitors or changes in what customers want, and makes plans based on those thoughts.

26. Understanding Perspectives

A team is designing a new toy. They think about what kids and parents would both like instead of just what they think is fun.

School Examples

27. Researching a Topic

For a history project, a student looks up different sources to understand an event from multiple viewpoints.

28. Debating an Issue

In a class discussion, students pick sides on a topic, like school uniforms, and share reasons to support their views.

29. Evaluating Sources

While writing an essay, a student checks if the information from a website is trustworthy or might be biased.

30. Problem Solving in Math

When stuck on a tricky math problem, a student tries different methods to find the answer instead of giving up.

31. Analyzing Literature

In English class, students discuss why a character in a book made certain choices and what those decisions reveal about them.

32. Testing a Hypothesis

For a science experiment, students guess what will happen and then conduct tests to see if they're right or wrong.

33. Giving Peer Feedback

After reading a classmate's essay, a student offers suggestions for improving it.

34. Questioning Assumptions

In a geography lesson, students consider why certain countries are called "developed" and what that label means.

35. Designing a Study

For a psychology project, students plan an experiment to understand how people's memories work and think of ways to ensure accurate results.

36. Interpreting Data

In a science class, students look at charts and graphs from a study, then discuss what the information tells them and if there are any patterns.

Critical Thinking Puzzles


Not all scenarios will have a single correct answer that can be figured out by thinking critically. Sometimes we have to think critically about ethical choices or moral behaviors. 

Here are some mind games and scenarios you can solve using critical thinking. You can see the solution(s) at the end of the post.

37. The Farmer, Fox, Chicken, and Grain Problem

A farmer is at a riverbank with a fox, a chicken, and a grain bag. He needs to get all three items across the river. However, his boat can only carry himself and one of the three items at a time. 

Here's the challenge:

  • If the fox is left alone with the chicken, the fox will eat the chicken.
  • If the chicken is left alone with the grain, the chicken will eat the grain.

How can the farmer get all three items across the river without any item being eaten? 

38. The Rope, Jar, and Pebbles Problem

You are in a room with two long ropes hanging from the ceiling. Each rope is just out of arm's reach from the other, so you can't hold onto one rope and reach the other simultaneously. 

Your task is to tie the two rope ends together, but you can't move the position where they hang from the ceiling.

You are given a jar full of pebbles. How do you complete the task?

39. The Two Guards Problem

Imagine there are two doors. One door leads to certain doom, and the other leads to freedom. You don't know which is which.

In front of each door stands a guard. One guard always tells the truth. The other guard always lies. You don't know which guard is which.

You can ask only one question to one of the guards. What question should you ask to find the door that leads to freedom?

40. The Hourglass Problem

You have two hourglasses. One measures 7 minutes when turned over, and the other measures 4 minutes. Using just these hourglasses, how can you time exactly 9 minutes?

41. The Lifeboat Dilemma

Imagine you're on a ship that's sinking. You get on a lifeboat, but it's already too full and might flip over. 

Nearby in the water, five people are struggling: a scientist close to finding a cure for a sickness, an old couple who've been together for a long time, a mom with three kids waiting at home, and a tired teenager who helped save others but is now in danger. 

You can only save one person without making the boat flip. Who would you choose?

42. The Tech Dilemma

You work at a tech company and help make a computer program to help small businesses. You're almost ready to share it with everyone, but you find out there might be a small chance it has a problem that could show users' private info. 

If you decide to fix it, you must wait two more months before sharing it. But your bosses want you to share it now. What would you do?

43. The History Mystery

Dr. Amelia is a history expert. She's studying where a group of people traveled long ago. She reads old letters and documents to learn about it. But she finds some letters that tell a different story than what most people believe. 

If she says this new story is true, it could change what people learn in school and what they think about history. What should she do?

The Role of Bias in Critical Thinking

Have you ever decided you don’t like someone before you even know them? Or maybe someone shared an idea with you that you immediately loved without even knowing all the details. 

This experience is called bias, which occurs when you like or dislike something or someone without a good reason or knowing why. It can also take shape in certain reactions to situations, like a habit or instinct. 

Bias comes from our own experiences, what friends or family tell us, or even things we are born believing. Sometimes, bias can help us stay safe, but other times it stops us from seeing the truth.

Not all bias is bad. Bias can be a mechanism for assessing our potential safety in a new situation. If we are biased to think that anything long, thin, and coiled up is a snake, we might be afraid of a rope before we realize it is just a rope.

While bias might serve us in some situations (like jumping out of the way of an actual snake before we have time to process that we need to be jumping out of the way), it often harms our ability to think critically.

How Bias Gets in the Way of Good Thinking

Selective Perception: We only notice things that match our ideas and ignore the rest. 

It's like only picking red candies from a mixed bowl because you think they taste the best, but they taste the same as every other candy in the bowl. It could also be when we see all the signs that our partner is cheating on us but choose to ignore them because we are happy the way we are (or at least, we think we are).

Agreeing with Yourself: This is called “confirmation bias”: we only listen to ideas that match our own, and we seek, interpret, and remember information in a way that confirms what we already think we know or believe.

An example is when someone wants to know if it is safe to vaccinate their children but already believes that vaccines are not safe, so they only look for information supporting the idea that vaccines are bad.

Thinking We Know It All: Similar to confirmation bias, this is called “overconfidence bias.” Sometimes we think our ideas are the best and don't listen to others. This can stop us from learning.

Have you ever met someone you’d consider a “know-it-all”? They probably have a lot of overconfidence bias, because while they may know many things accurately, they can’t know everything. Still, if they act like they do, they show overconfidence bias.

There's a weird kind of bias similar to this called the Dunning-Kruger Effect: when someone is bad at what they do but believes and acts like they are the best.

Following the Crowd: This is formally called “groupthink.” If everyone agrees, it's hard to speak up with a different idea, but going along can lead to mistakes.

An example of this we’ve all likely seen is the cool clique in primary school. There is usually one person that is the head of the group, the “coolest kid in school”, and everyone listens to them and does what they want, even if they don’t think it’s a good idea.

How to Overcome Biases

Here are a few ways to learn to think better, free from our biases (or at least aware of them!).

Know Your Biases: Realize that everyone has biases. If we know about them, we can think better.

Listen to Different People: Talking to different kinds of people can give us new ideas.

Ask Why: Always ask yourself why you believe something. Is it true, or is it just a bias?

Understand Others: Try to think about how others feel. It helps you see things in new ways.

Keep Learning: Always be curious and open to new information.


In today's world, everything changes fast, and there's so much information everywhere. This makes critical thinking super important. It helps us distinguish between what's real and what's made up. It also helps us make good choices. But thinking this way can be tough sometimes because of biases. These are like sneaky thoughts that can trick us. The good news is we can learn to see them and think better.

There are cool tools and ways we've talked about, like the "Socratic Questioning" method and the "Six Thinking Hats." These tools help us get better at thinking. These thinking skills can also help us in school, work, and everyday life.

We’ve also looked at specific scenarios where critical thinking would be helpful, such as deciding what diet to follow and checking facts.

Thinking isn't just a skill—it's a special talent we improve over time. Working on it lets us see things more clearly and understand the world better. So, keep practicing and asking questions! It'll make you a smarter thinker and help you see the world differently.

Critical Thinking Puzzles (Solutions)

The Farmer, Fox, Chicken, and Grain Problem

  • The farmer first takes the chicken across the river and leaves it on the other side.
  • He returns to the original side and takes the fox across the river.
  • After leaving the fox on the other side, he returns the chicken to the starting side.
  • He leaves the chicken on the starting side and takes the grain bag across the river.
  • He leaves the grain with the fox on the other side and returns to get the chicken.
  • The farmer takes the chicken across, and now all three items -- the fox, the chicken, and the grain -- are safely on the other side of the river.
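As a sanity check, the crossing plan can be replayed step by step. The sketch below (hypothetical Python, using the item names from the puzzle) verifies that no forbidden pair is ever left alone on a bank:

```python
# Forbidden pairs that must never be left together without the farmer.
UNSAFE = [{"fox", "chicken"}, {"chicken", "grain"}]

def is_safe(unattended_bank):
    """A bank without the farmer is safe if it contains no forbidden pair."""
    return not any(pair <= unattended_bank for pair in UNSAFE)

def simulate(moves):
    """moves: items the farmer ferries (None = empty trip), alternating banks."""
    near, far = {"fox", "chicken", "grain"}, set()
    farmer_near = True
    for cargo in moves:
        src, dst = (near, far) if farmer_near else (far, near)
        if cargo is not None:
            src.remove(cargo)
        if not is_safe(src):          # the bank the farmer just left
            return False
        if cargo is not None:
            dst.add(cargo)
        farmer_near = not farmer_near
    return far == {"fox", "chicken", "grain"}

# The solution above: chicken over, return empty, fox over, chicken back,
# grain over, return empty, chicken over.
plan = ["chicken", None, "fox", "chicken", "grain", None, "chicken"]
print(simulate(plan))  # True: every intermediate state is safe
```

Swapping in an unsafe plan, such as ferrying the fox first, makes the check fail immediately.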

The Rope, Jar, and Pebbles Problem

  • Take one rope and tie the jar of pebbles to its end.
  • Swing the rope with the jar in a pendulum motion.
  • While the rope is swinging, grab the other rope and wait.
  • As the swinging rope comes back within reach due to its pendulum motion, grab it.
  • With both ropes within reach, untie the jar and tie the rope ends together.

The Two Guards Problem

Ask either guard, "Which door would the other guard say leads to doom?" The truth-teller will honestly report the liar's false answer, and the liar will falsely report the truth-teller's honest answer, so both guards name the freedom door. Choose the door they indicate.
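Nested-question puzzles like this are easy to get backwards, so a brute-force check is reassuring. This hypothetical Python sketch (door labels invented for illustration) enumerates both cases of asking either guard "Which door would the other guard say leads to doom?" and shows the reply always names the freedom door:

```python
DOORS = {"A", "B"}

def answer(asked_is_liar, freedom_door):
    """Door named by the asked guard in reply to:
    'Which door would the other guard say leads to doom?'"""
    doom_door = (DOORS - {freedom_door}).pop()
    other_is_liar = not asked_is_liar
    # What the other guard would claim is the doom door:
    other_would_say = freedom_door if other_is_liar else doom_door
    # The asked guard relays that claim truthfully, or lies about it:
    if asked_is_liar:
        return (DOORS - {other_would_say}).pop()
    return other_would_say

for asked_is_liar in (False, True):
    for freedom in ("A", "B"):
        # In all four cases the reply equals the freedom door.
        print(asked_is_liar, freedom, answer(asked_is_liar, freedom))
```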

The Hourglass Problem

  • Start both hourglasses. 
  • When the 4-minute hourglass runs out, turn it over.
  • When the 7-minute hourglass runs out, the 4-minute hourglass will have been running for 3 minutes, with 1 minute of sand left. Turn the 7-minute hourglass over.
  • When the 4-minute hourglass runs out for the second time (8 minutes total), the 7-minute hourglass will have been running for 1 minute. Turn it over again; that 1 minute of sand runs back through, emptying at exactly 9 minutes.
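The timing argument can be checked mechanically. This hypothetical Python sketch replays the schedule minute by minute and confirms the 7-minute glass empties exactly at the 9-minute mark:

```python
def run_schedule():
    four, seven = 4, 7          # minutes of sand left in each top bulb
    t, events = 0, []
    while t < 9:
        t += 1
        four = max(0, four - 1)
        seven = max(0, seven - 1)
        if t == 4:              # 4-minute glass empties: flip it
            four = 4
            events.append((t, "flip 4"))
        if t == 7:              # 7-minute glass empties: flip it
            seven = 7
            events.append((t, "flip 7"))
        if t == 8:              # 4-minute glass empties again: flip the 7 back
            seven = 7 - seven   # the 1 fallen minute goes back on top
            events.append((t, "flip 7 back"))
    return events, seven

events, left = run_schedule()
print(events)  # [(4, 'flip 4'), (7, 'flip 7'), (8, 'flip 7 back')]
print(left)    # 0 -- the 7-minute glass empties exactly at t = 9
```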


The Lifeboat Dilemma

There isn’t one correct answer to this problem. Here are some elements to consider:

  • Moral Principles: What values guide your decision? Is it the potential greater good for humanity (the scientist)? What is the value of long-standing love and commitment (the elderly couple)? What is the future of young children who depend on their mothers? Or the selfless bravery of the teenager?
  • Future Implications: Consider the future consequences of each choice. Saving the scientist might benefit millions in the future, but what moral message does it send about the value of individual lives?
  • Emotional vs. Logical Thinking: While it's essential to engage empathy, it's also crucial not to let emotions cloud judgment entirely. For instance, while the teenager's bravery is commendable, does it make him more deserving of a spot on the boat than the others?
  • Acknowledging Uncertainty: The scientist claims to be close to a significant breakthrough, but there's no certainty. How does this uncertainty factor into your decision?
  • Personal Bias: Recognize and challenge any personal biases, such as biases towards age, profession, or familial status.

The Tech Dilemma

Again, there isn’t one correct answer to this problem. Here are some elements to consider:

  • Evaluate the Risk: How severe is the potential vulnerability? Can it be easily exploited, or would it require significant expertise? Even if the circumstances are rare, what would be the consequences if the vulnerability were exploited?
  • Stakeholder Considerations: Different stakeholders will have different priorities. Upper management might prioritize financial projections, the marketing team might be concerned about the product's reputation, and customers might prioritize the security of their data. How do you balance these competing interests?
  • Short-Term vs. Long-Term Implications: While launching on time could meet immediate financial goals, consider the potential long-term damage to the company's reputation if the vulnerability is exploited. Would the short-term gains be worth the potential long-term costs?
  • Ethical Implications: Beyond the financial and reputational aspects, there's an ethical dimension to consider. Is it right to release a product with a known vulnerability, even if the chances of it being exploited are low?
  • Seek External Input: Consulting with cybersecurity experts outside your company might be beneficial. They could provide a more objective risk assessment and potential mitigation strategies.
  • Communication: How will you communicate the decision, whatever it may be, both internally to your team and upper management and externally to your customers and potential users?

The History Mystery

Dr. Amelia should take the following steps:

  • Verify the Letters: Before making any claims, she should confirm that the letters are authentic rather than forged. She can do this by examining when and where they were written and whether they are consistent with other documents from that period.
  • Get a Second Opinion: It's always good to have someone else look at what you've found. Dr. Amelia could show the letters to other history experts and see their thoughts.
  • Research More: Maybe there are more documents or letters out there that support this new story. Dr. Amelia should keep looking to see if she can find more evidence.
  • Share the Findings: If Dr. Amelia believes the letters are true after all her checks, she should tell others. This can be through books, talks, or articles.
  • Stay Open to Feedback: Some people might agree with Dr. Amelia, and others might not. She should listen to everyone and be ready to learn more or change her mind if new information arises.

Ultimately, Dr. Amelia's job is to find out the truth about history and share it. It's okay if this new truth differs from what people used to believe. History is about learning from the past, no matter the story.




What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan . Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment .

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources .

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria


Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process . The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing , critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper . It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test , these questions focus on the currency , relevance , authority , accuracy , and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed ?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion ? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?


Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test , focusing on the currency , relevance , authority , accuracy , and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test  and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. It refers to the ability to recollect information best when it amplifies what we already believe. Relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.


3. Critical Thinking in Science: How to Foster Scientific Reasoning Skills

Critical thinking in science is important largely because a lot of students have developed expectations about science that can prove to be counter-productive. 

After various experiences — both in school and out — students often perceive science to be primarily about learning “authoritative” content knowledge: this is how the solar system works; that is how diffusion works; this is the right answer and that is not. 

This perception allows little room for critical thinking in science, in spite of the fact that argument, reasoning, and critical thinking lie at the very core of scientific practice.


In this article, we outline two approaches that are especially effective in fostering scientific reasoning. Both try to put students in a scientist’s frame of mind more than is typical in science education:

  • First, we look at  small-group inquiry , where students formulate questions and investigate them in small groups. This approach is geared more toward younger students but has applications at higher levels too.
  • Second, we look at science labs. Too often, labs involve students simply following recipes or replicating standard results. Here, we offer tips to turn labs into spaces for independent inquiry and scientific reasoning.


I. Critical Thinking in Science and Scientific Inquiry

Even very young students can “think scientifically” under the right instructional support. A series of experiments, for instance, established that preschoolers can make statistically valid inferences about unknown variables. Through observation, they are also capable of distinguishing actions that cause certain outcomes from actions that don’t. These innate capacities, however, have to be developed for students to grow into rigorous scientific critical thinkers.


Although there are many techniques to get young children involved in scientific inquiry — encouraging them to ask and answer “why” questions, for instance — teachers can provide structured scientific inquiry experiences that are deeper than students can experience on their own. 

Goals for Teaching Critical Thinking Through Scientific Inquiry

When it comes to teaching critical thinking via science, the learning goals may vary, but students should learn that:

  • Failure to agree is okay, as long as you have reasons for why you disagree about something.
  • The logic of scientific inquiry is iterative. Scientists always have to consider how they might improve their methods next time. This includes addressing sources of uncertainty.
  • Claims to knowledge usually require multiple lines of evidence and a “match” or “fit” between our explanations and the evidence we have.
  • Collaboration, argument, and discussion are central features of scientific reasoning.
  • Visualization, analysis, and presentation are central features of scientific reasoning.
  • Overarching concepts in scientific practice — such as uncertainty, measurement, and meaningful experimental contrasts — manifest themselves somewhat differently in different scientific domains.

How to Teach Critical Thinking in Science via Inquiry

Sometimes we think of science education as being either a “direct” approach, where we tell students about a concept, or an “inquiry-based” approach, where students explore a concept themselves.  

But, especially at the earliest grades, integrating the two approaches can inform students of their options (i.e., generate and extend their ideas) while also letting them make decisions about what to do.

As with many projects targeting critical thinking, limited classroom time is a challenge. Although the latest content standards, such as the Next Generation Science Standards, emphasize teaching scientific practices, many standardized tests still emphasize assessing scientific content knowledge.


Creating a lesson that targets the right content is also an important aspect of developing authentic scientific experiences. It’s now more  widely acknowledged  that effective science instruction involves the interaction between domain-specific knowledge and domain-general knowledge, and that linking an inquiry experience to appropriate target content is vital.

For instance, the concept of uncertainty  comes up  in every scientific domain. But the sources of uncertainty coming from any given measurement vary tremendously by discipline. It requires content knowledge to know how to wisely apply the concept of uncertainty.
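As one domain-general illustration (ours, not the article's), repeated measurements are often summarized by a mean and a standard error; what counts as a meaningful repeat, and which error sources dominate, is exactly the domain-specific part. The timing data below are invented:

```python
# Summarize repeated measurements with a mean and a standard error of the mean.
import math

def mean_and_standard_error(measurements):
    n = len(measurements)
    mean = sum(measurements) / n
    # Sample variance (n - 1 denominator), then standard error = s / sqrt(n).
    variance = sum((x - mean) ** 2 for x in measurements) / (n - 1)
    return mean, math.sqrt(variance / n)

# e.g., five repeated timings (seconds) of the same pendulum swing
timings = [2.05, 1.98, 2.11, 2.02, 1.99]
m, se = mean_and_standard_error(timings)
print(f"{m:.3f} ± {se:.3f} s")
```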

Tips and Challenges for Teaching Critical Thinking in Science

Teachers need to grapple with student misconceptions. Student intuition about how the world works — the way living things grow and behave, the way that objects fall and interact — often conflicts with scientific explanations. As part of the inquiry experience, teachers can help students to articulate these intuitions and revise them through argument and evidence.

Group composition is another challenge. Teachers will want to avoid situations where one member of the group will simply “take charge” of the decision-making, while other member(s) disengage. In some cases, grouping students by current ability level can make the group work more productive. 

Another approach is to establish group norms that help prevent unproductive group interactions. A third tactic is to have each group member learn an essential piece of the puzzle prior to the group work, so that each member is bringing something valuable to the table (which other group members don’t yet know).

It’s critical to ask students about how certain they are in their observations and explanations and what they could do better next time. When disagreements arise about what to do next or how to interpret evidence, the instructor should model good scientific practice by, for instance, getting students to think about what kind of evidence would help resolve the disagreement or whether there’s a compromise that might satisfy both groups.

The subjects of the inquiry experience and the tools at students’ disposal will depend upon the class and the grade level. Older students may be asked to create mathematical models, more sophisticated visualizations, and give fuller presentations of their results.

Lesson Plan Outline

This lesson plan takes a small-group inquiry approach to critical thinking in science. It asks students to collaboratively explore a scientific question, or perhaps a series of related questions, within a scientific domain.

Suppose students are exploring insect behavior. Groups may decide what questions to ask about insect behavior; how to observe, define, and record insect behavior; how to design an experiment that generates evidence related to their research questions; and how to interpret and present their results.

An in-depth inquiry experience usually takes place over the course of several classroom sessions, and includes classroom-wide instruction, small-group work, and potentially some individual work as well.

Students, especially younger students, will typically need some background knowledge that can inform more independent decision-making. So providing classroom-wide instruction and discussion before individual group work is a good idea.

For instance, Kathleen Metz had students observe insect behavior, explore the anatomy of insects, draw habitat maps, and collaboratively formulate (and categorize) research questions before students began to work more independently.

The subjects of a science inquiry experience can vary tremendously: local weather patterns, plant growth, pollution, bridge-building. The point is to engage students in multiple aspects of scientific practice: observing, formulating research questions, making predictions, gathering data, analyzing and interpreting data, refining and iterating the process.

As student groups take responsibility for their own investigation, teachers act as facilitators. They can circulate around the room, providing advice and guidance to individual groups. If classroom-wide misconceptions arise, they can pause group work to address those misconceptions directly and re-orient the class toward a more productive way of thinking.

Throughout the process, teachers can also ask questions like:

  • What are your assumptions about what’s going on? How can you check your assumptions?
  • Suppose that your results show X, what would you conclude?
  • If you had to do the process over again, what would you change? Why?


II. Rethinking Science Labs

Beyond changing how students approach scientific inquiry, we also need to rethink science labs. After all, science lab activities are ubiquitous in science classrooms and they are a great opportunity to teach critical thinking skills.

Often, however, science labs are merely recipes that students follow to verify standard values (such as the force of acceleration due to gravity) or relationships between variables (such as the relationship between force, mass, and acceleration) known to the students beforehand. 

This approach does not usually involve critical thinking: students are not making many decisions during the process, and they do not reflect on what they’ve done except to see whether their experimental data matches the expected values.

With some small tweaks, however, science labs can involve more critical thinking. Lab activities that give students the opportunity not only to design, analyze, and interpret an experiment, but to re-design, re-analyze, and re-interpret it, provide ample opportunity for grappling with evidence and evidence-model relationships, particularly if students don’t know beforehand what answer to expect.

Such activities improve scientific reasoning skills, such as: 

  • Evaluating quantitative data
  • Generating plausible scientific explanations for observed patterns

And also broader critical thinking skills, like:

  • Comparing models to data, and comparing models to each other
  • Thinking about what kind of evidence supports one model or another
  • Being open to changing your beliefs based on evidence

Traditional science lab experiences bear little resemblance to actual scientific practice. Actual practice  involves  decision-making under uncertainty, trial-and-error, tweaking experimental methods over time, testing instruments, and resolving conflicts among different kinds of evidence. Traditional in-school science labs rarely involve these things.


When teachers use science labs as opportunities to engage students in the kinds of dilemmas that scientists actually face during research, students make more decisions and exhibit more sophisticated reasoning.

In the lesson plan below, students are asked to evaluate two models of drag force on a falling object. One model assumes that drag increases linearly with the velocity of the falling object; the other assumes that drag increases quadratically (i.e., with the square of the velocity).

Students use a motion detector and computer software to plot the position of a disposable paper coffee filter as it falls to the ground. Among other variables, students can vary the number of coffee filters they drop at once, the height from which they drop them, how they drop them, and how they clean their data. This is an approach to scaffolding critical thinking: a way to get students to ask the right kinds of questions and to think the way scientists tend to think.
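To make the two models concrete, here is a minimal simulation sketch (ours, not part of the lesson plan; the mass, drag coefficients, and drop height are invented for illustration):

```python
import math

# Two candidate drag models for a falling coffee filter (illustrative values).
# linear:    F_drag = b * v     -> terminal velocity v_t = m*g/b
# quadratic: F_drag = c * v**2  -> terminal velocity v_t = sqrt(m*g/c)

G = 9.81            # gravitational acceleration, m/s^2
M = 0.001           # filter mass in kg (assumed)
B, C = 0.02, 0.01   # drag coefficients (assumed)

def fall_time(drag_accel, drop_height=2.0, dt=0.001):
    """Euler-integrate the fall; drag_accel(v) returns the drag deceleration."""
    y, v, t = drop_height, 0.0, 0.0
    while y > 0:
        v += (G - drag_accel(v)) * dt
        y -= v * dt
        t += dt
    return t

t_lin = fall_time(lambda v: (B / M) * v)
t_quad = fall_time(lambda v: (C / M) * v * v)
vt_linear = M * G / B                 # 0.4905 m/s
vt_quadratic = math.sqrt(M * G / C)   # ~0.990 m/s

print(f"linear model:    fall time {t_lin:.2f} s, terminal velocity {vt_linear:.3f} m/s")
print(f"quadratic model: fall time {t_quad:.2f} s, terminal velocity {vt_quadratic:.3f} m/s")
```

With these (assumed) parameters, the two models predict noticeably different fall times, which is exactly the kind of contrast students can probe by varying filter count and drop height.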

Design an experiment to test which model best characterizes the motion of the coffee filters. 

Things to think about in your design:

  • What are the relevant variables to control and which ones do you need to explore?
  • What are some logistical issues associated with the data collection that may cause unnecessary variability (either random or systematic) or mistakes?
  • How can you control or measure these?
  • What ways can you graph your data and which ones will help you figure out which model better describes your data?

Discuss your design with other groups and modify as you see fit.

Initial data collection

Conduct a quick trial-run of your experiment so that you can evaluate your methods.

  • Do your graphs provide evidence of which model is the best?
  • What ways can you improve your methods, data, or graphs to make your case more convincing?
  • Do you need to change how you’re collecting data?
  • Do you need to take data at different regions?
  • Do you just need more data?
  • Do you need to reduce your uncertainty?

After this initial evaluation of your data and methods, conduct the desired improvements, changes, or additions and re-evaluate at the end.

In your lab notes, make sure to keep track of your progress and process as you go. As always, your final product is less important than how you get there.

How to Make Science Labs Run Smoothly

Managing student expectations . As with many other lesson plans that incorporate critical thinking, students are not used to having so much freedom. As with the example lesson plan above, it’s important to scaffold student decision-making by pointing out what decisions have to be made, especially as students are transitioning to this approach.

Supporting student reasoning . Another challenge is to provide guidance to student groups without telling them how to do something. Too much “telling” diminishes student decision-making, but not enough support may leave students simply not knowing what to do. 

There are several key strategies teachers can try out here: 

  • Point out an issue with their data collection process without specifying exactly how to solve it.
  • Ask a lab group how they would improve their approach.
  • Ask two groups with conflicting results to compare their results, methods, and analyses.


Sources and Resources

Lehrer, R., & Schauble, L. (2007). Scientific thinking and scientific literacy . Handbook of child psychology , Vol. 4. Wiley. A review of research on scientific thinking and experiments on teaching scientific thinking in the classroom.

Metz, K. (2004). Children’s understanding of scientific inquiry: Their conceptualizations of uncertainty in investigations of their own design . Cognition and Instruction 22(2). An example of a scientific inquiry experience for elementary school students.

The Next Generation Science Standards . The latest U.S. science content standards.

Concepts of Evidence A collection of important concepts related to evidence that cut across scientific disciplines.

Scienceblind A book about children’s science misconceptions and how to correct them.

Holmes, N. G., Keep, B., & Wieman, C. E. (2020). Developing scientific decision making by structuring and supporting student agency. Physical Review Physics Education Research , 16 (1), 010109. A research study on minimally altering traditional lab approaches to incorporate more critical thinking. The drag example was taken from this piece.

ISLE , led by E. Etkina.  A platform that helps teachers incorporate more critical thinking in physics labs.

Holmes, N. G., Wieman, C. E., & Bonn, D. A. (2015). Teaching critical thinking. Proceedings of the National Academy of Sciences, 112(36), 11199-11204. An approach to improving critical thinking and reflection in science labs.

Walker, J. P., Sampson, V., Grooms, J., Anderson, B., & Zimmerman, C. O. (2012). Argument-driven inquiry in undergraduate chemistry labs: The impact on students’ conceptual understanding, argument skills, and attitudes toward science. Journal of College Science Teaching, 41(4), 74-81. A large-scale research study on transforming chemistry labs to be more inquiry-based.



Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.


Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

2.1 Dewey’s Three Main Examples

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
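A modern aside, not part of Dewey’s text: the expansion Dewey infers is ordinary thermal expansion of a gas. Assuming roughly constant pressure, Charles’s law gives the size of the effect (the particular temperatures below are illustrative assumptions, not figures from the example):

```latex
% Charles's law (constant pressure): volume scales with absolute temperature.
\frac{V_2}{V_1} = \frac{T_2}{T_1}
  \approx \frac{333\ \text{K}}{293\ \text{K}} \approx 1.14
% i.e., air trapped at about 20 °C (293 K) and warmed by the hot tumbler to
% about 60 °C (333 K) expands by roughly 14%, enough to force bubbles out;
% cooling contracts the air again and reverses the flow, as the ice test shows.
```

Any warming above room temperature would produce the outward bubbling, and the subsequent cooling the inward reversal; the exact figures do not matter to Dewey’s reasoning.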

2.2 Dewey’s Other Examples

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
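Dewey’s 33-foot figure can be checked with a short barometric calculation (a modern gloss, not in Dewey’s text): a suction pump does not pull water up; atmospheric pressure pushes it up, so the maximum column height is the height of water that sea-level atmospheric pressure can support:

```latex
% Height of a water column balanced by sea-level atmospheric pressure:
h_{\max} = \frac{P_{\text{atm}}}{\rho g}
         = \frac{101\,325\ \text{Pa}}{(1000\ \text{kg/m}^3)(9.81\ \text{m/s}^2)}
         \approx 10.3\ \text{m} \approx 34\ \text{ft}
```

This matches the observed limit of about 33 feet; at higher elevations \(P_{\text{atm}}\) is smaller, so the maximum height is lower, which is exactly the variation the scientist in the example seizes on.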

2.3 Further Examples

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

2.4 Non-Examples

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). 
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

5. The Process of Thinking Critically

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions , in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis , to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition ( reasoning , in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 
Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

6. Components of the Process

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump )
  • Feeling : One feels puzzled or uncertain about something (how to get to an appointment on time in Transit , why the diamonds vary in spacing in Diamond ). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit , diamonds closer when needed as a warning in Diamond ).
  • Wondering : One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles , how suction pumps work in Suction pump , what caused the rash in Rash ).
  • Imagining : One thinks of possible answers (bus or subway or elevated in Transit , flagpole or ornament or wireless communication aid or direction indicator in Ferryboat , allergic reaction or heat rash in Rash ).
  • Inferring : One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder , earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash ). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit , burglary in Disorder , discontinue blood pressure medication and new cream in Rash ).
  • Knowledge : One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit , of the requirements for a flagpole in Ferryboat , of Boyle’s law in Bubbles , of allergic reactions in Rash ).
  • Experimenting : One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat , putting an ice cube on top of a tumbler taken from hot water in Bubbles , measuring the height to which a suction pump will draw water at different elevations in Suction pump , noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond ).
  • Consulting : One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments : One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate . It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging : One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding : One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

7. Contributory Dispositions and Abilities

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

9. Critical Thinking Abilities

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit, has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) provide a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452).
Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

10. Required Knowledge

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).

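The difference between statistical significance and importance mentioned above can be made concrete with a small numerical sketch (an illustrative example with invented numbers, not drawn from the cited sources): with a large enough sample, even a practically trivial difference between two group means becomes statistically significant.

```python
import math

def two_sample_z_test(mean_a, mean_b, sd, n):
    """Two-sided z-test for the difference of two equal-size sample means,
    assuming a known common standard deviation sd and n subjects per group."""
    se = sd * math.sqrt(2.0 / n)            # standard error of the difference
    z = (mean_a - mean_b) / se
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value under the normal model
    return z, p

# The same tiny difference (0.1 points, sd = 10) at two sample sizes:
_, p_small = two_sample_z_test(70.1, 70.0, sd=10.0, n=100)        # p ≈ 0.94: not significant
_, p_large = two_sample_z_test(70.1, 70.0, sd=10.0, n=1_000_000)  # p ≈ 1.5e-12: highly significant
```

The effect size is identical in both cases; only the sample size differs, which is why statistical significance alone does not establish importance.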
According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).

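The idea of a noise audit and the benefit of aggregating independent judgments can be sketched as follows (a hypothetical illustration with invented figures; the function names and numbers are not from Kahneman, Sibony, & Sunstein 2021):

```python
import statistics

# Hypothetical noise audit: five experts independently assess the same case
# (invented dollar figures, for illustration only).
assessments = [9_500, 16_000, 12_300, 8_900, 14_700]

mean_assessment = statistics.mean(assessments)
# One simple noise measure: spread of the judgments relative to their mean.
noise_ratio = statistics.stdev(assessments) / mean_assessment  # ≈ 0.25 here

# Averaging k independent judgments shrinks unsystematic variability by
# roughly a factor of sqrt(k), which is one reason to collect independent
# judgments first and aggregate them afterwards.
def noise_after_averaging(sd: float, k: int) -> float:
    return sd / k ** 0.5
```

A spread of a quarter of the mean across judges of the very same case is pure noise in the sense defined above: the variability cannot reflect any difference in the cases being judged.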
Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

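The background physics the student in Bubbles relied on can be compressed into the combined gas law (standard physics, offered here as a gloss rather than as part of the text): for a fixed quantity of enclosed gas,

```latex
% Combined gas law for a fixed amount of enclosed gas:
\[ \frac{P_1 V_1}{T_1} = \frac{P_2 V_2}{T_2} \]
% While the trapped air warms up (T rises) inside the nearly rigid glass,
% its pressure exceeds that of the surrounding atmosphere and air escapes
% as bubbles; as the air cools (T falls), the pressure drops below
% atmospheric and water is pushed up into the glass.
```

At (nearly) constant volume, pressure varies directly with temperature, which is the relation doing the explanatory work in that example.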
11. Educational Methods

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favored ways are those of a dominant sex or culture (Bailin 1995). These ways favor:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History.

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment, Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez, Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.



Critical thinking in the lab (and beyond)

David Read


How to alter existing activities to foster scientific skills

Although many of us associate chemistry education with the laboratory, there remains a lack of evidence that correlates student learning with practical work. It is vital we continue to improve our understanding of how students learn from practical work, and we should devise methods that maximise the benefits. Jon-Marc Rodriguez and Marcy Towns, researchers at Purdue University, US, recently outlined an approach to modify existing practical activities to promote critical thinking in students, supporting enhanced learning. [1]



After an experiment, rather than asking a question, task students with plotting a graph; it’ll induce critical thinking and engagement with science practices

Jon-Marc and Marcy focused on critical thinking as a skill needed for successful engagement with the eight ‘science practices’. These practices come from a 2012 framework for science education published by the US National Research Council. The eight practices are: asking questions; developing and using models; planning and carrying out investigations; analysing and interpreting data; using mathematics and computational thinking; constructing explanations; engaging in argument from evidence; and obtaining, evaluating and communicating information. Such skills are widely viewed as integral to an effective chemistry programme. Practising scientists use multiple tools simultaneously when addressing a question, and well-designed practical activities that give students the opportunity to engage with numerous science practices will promote students’ scientific development.

The Purdue researchers chose to examine a traditional laboratory experiment on acid–base titrations because of its ubiquity in chemistry teaching. They characterised the pre- and post-lab questions associated with this experiment in terms of their alignment with the eight science practices. Only two of the ten pre- and post-lab questions elicited engagement with science practices, demonstrating the limitations of the traditional approach. Notably, the pre-lab questions included numerous calculations that were not considered to promote engagement with the science practices: students could answer them algorithmically, with no consideration of the significance of their answers.

Next, Jon-Marc and Marcy modified the experiment and rewrote the pre- and post-lab questions to foster engagement with the science practices, drawing on recent research that recommends minimising the amount of information given to students and developing a general understanding of the underlying theory. [2] The modified set of questions was shorter, with a greater emphasis on conceptual understanding: the questions probed aspects such as the suitability of the method and the central question behind the experiment, and were more open, giving greater scope for developing critical thinking.


In taking an existing protocol and reframing it in terms of science practices, the authors demonstrate an approach instructors can use to adapt their existing activities to promote critical thinking. Using this approach, instructors do not have to spend excessive time creating new activities. Additionally, instructors will have the opportunity to research the impact of their approach on student learning in the teaching laboratory.

Teaching tips

Question phrasing and the steps students should go through to get an answer are instrumental in inducing critical thinking and engagement with science practices. As noted above, simple calculation-based questions do not prompt students to consider the significance of the values calculated. Questions should:

  • refer to an event, observation or phenomenon;
  • ask students to perform a calculation or demonstrate a relationship between variables;
  • ask students to provide a consequence or interpretation (not a restatement) in some form (eg a diagram or graph) based on their results, in the context of the event, observation or phenomenon.

This is more straightforward than it might first seem. The example question Jon-Marc and Marcy give requires students to calculate percentage errors for two titration techniques before discussing the relative accuracy of the methods. Students have to use their data to explain which method was more accurate, prompting a much higher level of engagement than a simple calculation.

As pre-lab preparation, ask students to consider an experimental procedure and then explain in a couple of sentences what methods are going to be used and the rationale for their use. As part of their pre-lab, the Purdue University research team asked students to devise a scientific (‘research’) question that could be answered using the data collected. They then asked students to evaluate and modify their own questions as part of the post-lab, supporting the development of investigative skills. It would be straightforward to incorporate this approach into any practical activity.

Finally, ask students to evaluate a mock response from another student about an aspect of the theory (eg ‘acids react with bases because acids like to donate protons and bases like to accept them’). This elicits critical thinking that can engage every student, with scope to stretch the more able.

These approaches can help students develop a more sophisticated view of chemistry and the higher order skills that will serve them well whatever their future destination.

[1] J-M G Rodriguez and M H Towns, J. Chem. Educ., 2018, 95, 2141, DOI: 10.1021/acs.jchemed.8b00683

[2] H Y Agustian and M K Seery, Chem. Educ. Res. Pract., 2017, 18, 518, DOI: 10.1039/C7RP00140A




Education in the 21st Century, pp 29–47

Fostering Students’ Creativity and Critical Thinking in Science Education

  • Stéphan Vincent-Lancrin
  • First Online: 31 January 2022


What does it mean to redesign teaching and learning within existing science curricula (and learning objectives) so that students have more space and appropriate tasks to develop their creative and critical thinking skills? The chapter begins by describing the development of a portfolio of rubrics on creativity and critical thinking, including a conceptual rubric on science tested in primary and secondary education in 11 countries. Teachers in school networks adopted teaching and learning strategies aligned to these OECD rubrics for developing creativity and critical thinking. Examples of the lesson plans and pedagogies that were developed are given, and some key challenges for teachers and learners are discussed.

  • Critical thinking
  • Science education
  • Innovation in education
  • Lesson plans

The analyses given and the opinions expressed in this chapter are those of the author and do not necessarily reflect the views of the OECD and of its members.


Many project-based science units/courses initially develop “Driving Questions” to contextualise the unit and give learners opportunities to connect the unit to their own experiences and prior ideas.



© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG


Vincent-Lancrin, S. (2021). Fostering Students’ Creativity and Critical Thinking in Science Education. In: Berry, A., Buntting, C., Corrigan, D., Gunstone, R., Jones, A. (eds) Education in the 21st Century. Springer, Cham. https://doi.org/10.1007/978-3-030-85300-6_3


Critical Thinking Definition, Skills, and Examples


Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings.

Good critical thinkers can draw reasonable conclusions from a set of information and discriminate between useful and less useful details to solve problems or make decisions. Employers prioritize the ability to think critically; find out why below, and see how you can demonstrate this ability throughout the job application process.

Why Do Employers Value Critical Thinking Skills?

Employers want job candidates who can evaluate a situation using logical thought and offer the best solution.

Someone with critical thinking skills can be trusted to make decisions independently and will not need constant handholding or micromanaging. Critical thinking abilities are among the most sought-after skills in almost every industry and workplace. You can demonstrate critical thinking by using related keywords in your resume and cover letter, and during your interview.

Examples of Critical Thinking

The circumstances that demand critical thinking vary from industry to industry. Some examples include:

  • A triage nurse analyzes the cases at hand and decides the order by which the patients should be treated.
  • A plumber evaluates the materials that would best suit a particular job.
  • An attorney reviews evidence and devises a strategy to win a case or to decide whether to settle out of court.
  • A manager analyzes customer feedback forms and uses this information to develop a customer service training session for employees.

Promote Your Skills in Your Job Search

If critical thinking is a key phrase in the job listings you are applying for, be sure to emphasize your critical thinking skills throughout your job search.

Add Keywords to Your Resume

You can use critical thinking keywords (analytical, problem solving, creativity, etc.) in your resume. When describing your  work history , include top critical thinking skills that accurately describe you. You can also include them in your  resume summary , if you have one.

For example, your summary might read, “Marketing Associate with five years of experience in project management. Skilled in conducting thorough market research and competitor analysis to assess market trends and client needs, and to develop appropriate acquisition tactics.”

Mention Skills in Your Cover Letter

Include these critical thinking skills in your cover letter. In the body of your letter, mention one or two of these skills, and give specific examples of times when you have demonstrated them at work. Think about times when you had to analyze or evaluate materials to solve a problem.

Show the Interviewer Your Skills

You can use these skill words in an interview. Discuss a time when you were faced with a particular problem or challenge at work and explain how you applied critical thinking to solve it.

Some interviewers will give you a hypothetical scenario or problem and ask you to use critical thinking skills to solve it. In this case, explain your thought process thoroughly: the interviewer is typically more focused on how you arrive at your solution than on the solution itself, and wants to see you analyze and evaluate (key parts of critical thinking) the given scenario or problem.

Of course, each job will require different skills and experiences, so make sure you read the job description carefully and focus on the skills listed by the employer.

Top Critical Thinking Skills

Keep these in-demand critical thinking skills in mind as you update your resume and write your cover letter. As you've seen, you can also emphasize them at other points throughout the application process, such as your interview. 

Analytical Skills

Part of critical thinking is the ability to carefully examine something, whether it is a problem, a set of data, or a text. People with analytical skills can examine information, understand what it means, and properly explain the implications of that information to others.

  • Asking Thoughtful Questions
  • Data Analysis
  • Interpretation
  • Questioning Evidence
  • Recognizing Patterns

Communication

Often, you will need to share your conclusions with your employers or with a group of colleagues. You need to be able to  communicate with others  to share your ideas effectively. You might also need to engage in critical thinking in a group. In this case, you will need to work with others and communicate effectively to figure out solutions to complex problems.

  • Active Listening
  • Collaboration
  • Explanation
  • Interpersonal
  • Presentation
  • Verbal Communication
  • Written Communication

Creativity

Critical thinking often involves creativity and innovation. You might need to spot patterns in the information you are looking at or come up with a solution no one else has thought of before. All of this takes a creative eye that can find a different approach from the obvious ones.

  • Flexibility
  • Conceptualization
  • Imagination
  • Drawing Connections
  • Synthesizing

Open-Mindedness

To think critically, you need to be able to put aside any assumptions or judgments and merely analyze the information you receive. You need to be objective, evaluating ideas without bias.

  • Objectivity
  • Observation

Problem Solving

Problem-solving is another critical thinking skill that involves analyzing a problem, generating and implementing a solution, and assessing the success of the plan. Employers don’t simply want employees who can think about information critically; they also want people who can come up with practical solutions.

  • Attention to Detail
  • Clarification
  • Decision Making
  • Groundedness
  • Identifying Patterns

More Critical Thinking Skills

  • Inductive Reasoning
  • Deductive Reasoning
  • Noticing Outliers
  • Adaptability
  • Emotional Intelligence
  • Brainstorming
  • Optimization
  • Restructuring
  • Integration
  • Strategic Planning
  • Project Management
  • Ongoing Improvement
  • Causal Relationships
  • Case Analysis
  • Diagnostics
  • SWOT Analysis
  • Business Intelligence
  • Quantitative Data Management
  • Qualitative Data Management
  • Risk Management
  • Scientific Method
  • Consumer Behavior

Key Takeaways

  • Demonstrate that you have critical thinking skills by adding relevant keywords to your resume.
  • Mention pertinent critical thinking skills in your cover letter, too, and include an example of a time when you demonstrated them at work.
  • Finally, highlight critical thinking skills during your interview. For instance, you might discuss a time when you were faced with a challenge at work and explain how you applied critical thinking skills to solve it.




Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

  • Jason E. Dowd
  • Robert J. Thompson
  • Leslie A. Schiff
  • Julie A. Reynolds

*Address correspondence to: Jason E. Dowd ( E-mail Address: [email protected] ).

Department of Biology, Duke University, Durham, NC 27708


Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Developing critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference , while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference ) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but distinct constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, limited coherent frameworks and empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990, p. 3 ). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced on this consensus expert view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components (inquiry, writing, collaboration, and reflection) has been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach for enhancing critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the importance of the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though a thesis is required to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two did not consent to participate in the study; additionally, five did not validly complete the CCTST (i.e., they attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid and invalid CCTST scores. Therefore, in most of this study we focus on the 65 students who consented to participate and for whom we have complete and valid data. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on those 52 students.
We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and the claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
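The banding of CCTST scores described above can be expressed as a small lookup. This is an illustrative sketch; the function name is ours, and only the thresholds quoted in the text are taken from the source.

```python
def cctst_band(score: int) -> str:
    """Map a 0-100 CCTST score to the performance band quoted in the text."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"
```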

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
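The within-one-point agreement statistic reported above (82.4% before discussion) can be computed from two raters' independent scores. This is a sketch with a hypothetical function name; it simply counts paired ratings that differ by at most one point.

```python
def within_one_agreement(ratings_a, ratings_b):
    """Fraction of paired ratings (same dimensions, two raters) that
    differ by at most 1 point on the five-point BioTAP scale."""
    assert len(ratings_a) == len(ratings_b)
    close = sum(1 for a, b in zip(ratings_a, ratings_b) if abs(a - b) <= 1)
    return close / len(ratings_a)
```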

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
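The two comprehensive BioTAP measures can be sketched as follows. The loadings dictionary here is hypothetical (the published values are in Dowd et al., 2016), and the loading-weighted sum is a naive simplification of actual factor-score estimation, shown only to make the two summary measures concrete.

```python
def biotap_summaries(scores, loadings):
    """Compute the two comprehensive BioTAP measures described in the text.

    scores: dict mapping question number (1-9) to its 1-5 consensus rating.
    loadings: dict mapping question number to a factor loading (hypothetical
    values; a simplification of actual factor-score estimation).
    """
    # Partial sum over questions 1-5, the higher-order reasoning dimensions.
    partial_sum = sum(scores[q] for q in range(1, 6))
    # Naive factor score: loading-weighted sum of the rated dimensions.
    factor_score = sum(loadings[q] * scores[q] for q in loadings)
    return partial_sum, factor_score
```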

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
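The core pairwise statistic in this pipeline is Pearson's correlation. In practice one would use a statistics library, but the computation itself is short; this pure-Python sketch (function name ours) shows exactly what is being calculated for each pair of CCTST and BioTAP measures.

```python
def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # Covariance numerator and the product of the two standard-deviation terms.
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den
```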

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
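The Holm-Bonferroni correction itself is a simple step-down procedure, and can be made concrete with a short sketch. The implementation below is a minimal pure-Python illustration; the p values in any example call are made up, not the study's data.

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm (1979) step-down procedure; returns True where a hypothesis is rejected."""
    m = len(p_values)
    # Work through p-values from smallest to largest.
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        # The k-th smallest p-value is compared against alpha / (m - k).
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # once one test fails, all larger p-values also fail
    return reject
```

With 35 comparisons, the smallest p value must fall below 0.05/35 ≈ 0.00143 to survive, which is the cutoff applied when all BioTAP and CCTST dimensions are tested pairwise.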

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al., 2016, and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3. We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking (r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to the overall CCTST score (r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST (Table 3), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference (r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from the factor loadings published by Dowd et al. (2016).

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing the CCTST scores of students who exhibit mastery on each dimension of BioTAP with the scores of students who do not are presented in Table 4. Focusing first on overall CCTST scores, we found that the difference between those who exhibit mastery in discussing the implications of results (BioTAP question 5) and those who do not is statistically significant (t = 2.73, p = 0.008, d = 0.71). When we expanded the t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not (t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores (t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which convey the strength of the statistically significant differences, range from 0.71 to 0.88.

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size (Cohen’s d). Comparisons that are statistically significant after correcting for multiple comparisons are shown in bold.
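The group comparison above can be illustrated with a short sketch of the pooled-variance t statistic and Cohen's d. This is a minimal pure-Python illustration; the group score lists in any example are hypothetical, and the study's analyses would ordinarily rely on a statistics package for the associated p values.

```python
import math

def students_t_and_cohens_d(group1, group2):
    """Pooled-variance Student's t statistic and Cohen's d for two groups."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Sample variances (n - 1 in the denominator).
    var1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    var2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2))
    t = (m1 - m2) / (sp * math.sqrt(1 / n1 + 1 / n2))
    d = (m1 - m2) / sp  # Cohen's d: mean difference in pooled-SD units
    return t, d
```

Cohen's d expresses the mean difference in standard-deviation units, so values in the 0.71–0.88 range reported above correspond to medium-to-large effects.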

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

** p < 0.01.

*** p < 0.001.
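The multiple-regression step above can be sketched via the normal equations (X'X)b = X'y. The snippet below is a minimal pure-Python illustration with made-up data; the actual analysis would use dedicated regression software that also reports standard errors and p values.

```python
def ols(X, y):
    """Fit y = Xb by least squares; X is a list of rows (include a 1 for the intercept)."""
    k = len(X[0])
    # Build the normal-equation system X'X and X'y.
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Solve the k-by-k system by Gaussian elimination with partial pivoting.
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[pivot] = xtx[pivot], xtx[col]
        xty[col], xty[pivot] = xty[pivot], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    # Back substitution yields the coefficient vector b.
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c] for c in range(r + 1, k))) / xtx[r][r]
    return beta
```

Adding covariate columns to X corresponds to moving from model 1 to model 3: if the coefficient on the predictor of interest changes little when covariates are added, the covariates explain little of the relationship.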

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines (Kuhn et al., 2008). Moreover, science reasoning in writing, captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills (Table 3). The particularly significant roles of inference skills (Table 3) and the discussion of implications of results (BioTAP question 5; Table 4) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning (Reynolds et al., 2012) and critical thinking (Quitadamo and Kurtz, 2007; Quitadamo et al., 2008; Stephenson and Sadler-McKnight, 2016). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing. Even when we control for other potentially relevant background characteristics, the relationship is strong (Table 5).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and the results of other studies (with different populations) suggest that the observed relationships may hold more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that these relationships warrant further study across the full population of students.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

ACKNOWLEDGMENTS

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August, D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer, C. H., Taylor, E., & Gillmore, G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell, A. N., & Lemons, P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, 56(1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner, N. H., & Frazier, C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, 8(1), 47–64.
  • Clase, K. L., Gundlach, E., & Pelaez, N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, 38(5), 290–295.
  • Condon, W., & Kelly-Riley, D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, 9(1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding, L., Wei, X., & Liu, X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, 46(5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd, J. E., Connolly, M. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, 46(1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd, J. E., Roy, C. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, 92(1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd, J. E., Thompson, R. J., Jr., & Reynolds, J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 27, 36–51.
  • Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman, R. D., Russell, A. A., & Worden, K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, 36(5), 46–52.
  • Greenhoot, A. F., Semb, G., Colombo, J., & Schreiber, T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, 18(2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga, D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, 20(1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen, J. S., Bosack, T., Clay, S., McCarthy, M., Dunn, D. S., Hill, G. W., … Whitlock, K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, 30(3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand, B., & Keys, C. W. (1999). Inquiry investigation. Science Teacher, 66(4), 27–29.
  • Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6(2), 65–70.
  • Holyoak, K. J., & Morrison, R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly, G. J., & Takao, A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, 86(3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn, D., & Dean, D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, 5(2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23(4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson, A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, 94(2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish, D., LaVaque-Manty, D., Silver, N., & Kaplan, M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson, R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri, B., David, B.-C., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37(4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman, D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila, P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, 166(1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo, I. J., Faiola, C. L., Johnson, J. E., & Kurtz, M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, 7(3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo, I. J., & Kurtz, M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, 6(2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds, J. A., Smith, R., Moskovitz, C., & Sayle, A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, 59(10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds, J. A., Thaiss, C., Katkin, W., & Thompson, R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, 11(1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds, J. A., & Thompson, R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, 10(2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla, M., Brosseau-Liard, P. E., & Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, 17(3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson, N. S., & Sadler-McKnight, N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, 17(1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq, V. N., Stefani, L. A. J., Butcher, A. C., & Heylings, D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, 23(3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman, B. E. C., Strickland, D. C., Johnson, R. L., & Payne, J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, 36(5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, 25(2), 149–169. https://doi.org/10.1080/713611428
  • Willison, J., & O’Regan, K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, 26(4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin, T., Carter, V. C., & Fletcher, L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, 9(2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin, A., & Abd-El-Khalick, F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, 47(9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman, C. (2000). The development of scientific reasoning skills. Developmental Review, 20(1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, 27(2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001

Submitted: 17 March 2017 Revised: 19 October 2017 Accepted: 20 October 2017

© 2018 J. E. Dowd et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

PLOS ONE

What influences students’ abilities to critically evaluate scientific investigations?

Ashley B. Heim

1 Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America

2 Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

David Esparza

Michelle K. Smith, N. G. Holmes

Associated Data

All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive ( https://archive.ciser.cornell.edu/studies/2881 ).

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of whether the individual evaluation questions are included.
This research offers new insight on the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Introduction

Critical thinking and its importance.

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [ 1 ], is a foundational learning goal for almost any undergraduate course and can be integrated in many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [ 2 , 3 ]. Furthermore, critical thinking is consistently ranked as one of the most necessary outcomes of post-secondary education for career advancement by employers [ 4 ]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions based on multiple perspectives, keep an open mind, and acknowledge personal limitations [ 5 , 6 ]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [ 7 – 11 ], form hypotheses and make predictions [ 12 , 13 ], evaluate data [ 2 , 12 – 14 ], or draw conclusions based on a scenario or figure [ 2 , 12 – 14 ]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general) or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus definition of and assessment tools for critical thinking [ 15 , 16 ]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [ 17 ]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [ 18 , 19 ]. Current assessments include domain-general components [ 2 , 7 , 8 , 14 , 20 , 21 ], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [ 20 ] and context-specific components, such as to measure students’ abilities to think critically in domains such as neuroscience [ 9 ] and biology [ 10 ].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [ 22 – 24 ], as “thought processes are intertwined with what is being thought about” [ 23 ]. From this viewpoint, the context of the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [ 24 , 25 ].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [ 10 , 20 ]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [ 26 , 27 ]. Generally included in the “Analysis” level of Bloom’s taxonomy [ 28 – 30 ], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features between scenarios and discern relevant patterns or trends, rather than compile lists of important features [ 26 ]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [ 31 , 32 ]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies” [ 32 ]—to support students in their compare-and-contrast thinking.

Purpose and research questions

Our primary objective of this study was to better understand what features of assessment questions elicit student critical thinking using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time or comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday life. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they are accurately measuring the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to measure cognitive student outcomes and incorporate more effective critical thinking opportunities in the classroom.

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained by all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating in this study unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one two-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics and students were allowed to complete the assessments outside of class. The demographic distribution of the response data, all of which was self-reported by students, is presented in Table 1 . The values presented in this table represent all responses we received.

Instrument description

Question types.

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [ 1 ]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [ 1 ]; the prompts for this scenario can be found in S3 Appendix . The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions). Examples of these individual evaluation questions are in Table 2 . They then suggest next steps the group should pursue (next steps items). Students then read the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions to compare the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.

The Eco-BLIC examples are derived from the owl/mouse scenario.

Instrument versions

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.
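The branching described above can be sketched in Python. This is a hypothetical analogue only: the study used Qualtrics branch logic, and the version and block names below are ours.

```python
import random

# Hypothetical question-block layout mirroring the two instrument versions:
# one version includes the individual evaluation questions, the other
# presents only the group comparison items.
QUESTION_BLOCKS = {
    "with_individual": ["individual_group1", "individual_group2", "group_comparison"],
    "comparison_only": ["group_comparison"],
}

def assign_blocks(rng: random.Random) -> list[str]:
    """Randomly route a student to one version and return its question blocks."""
    version = rng.choice(sorted(QUESTION_BLOCKS))
    return QUESTION_BLOCKS[version]

blocks = assign_blocks(random.Random(0))
print(blocks)
```

Both flows end with the group comparison items, so the versions differ only in whether the individual evaluation questions precede them.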

Think-aloud interviews

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.

Data analyses

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).
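The alignment check described above amounts to tallying comparison answers within each pair of individual ratings, as in the multi-pie-chart figures. A minimal sketch, with invented responses rather than the study data:

```python
from collections import Counter

# Hypothetical per-student records: (Group 1 rating, Group 2 rating,
# group comparison answer), where ratings run 1 (weakness) to 4 (strength).
responses = [
    (4, 4, "both effective"),
    (4, 2, "group 1 more effective"),
    (4, 4, "group 1 more effective"),
    (2, 4, "group 2 more effective"),
]

# Count how often each comparison answer occurs within each rating pair
cells = Counter()
for g1, g2, comparison in responses:
    cells[(g1, g2, comparison)] += 1

# Proportion of each comparison answer within the (4, 4) "both strengths" cell
cell_total = sum(v for (g1, g2, _), v in cells.items() if (g1, g2) == (4, 4))
share = {c: v / cell_total for (g1, g2, c), v in cells.items() if (g1, g2) == (4, 4)}
print(share)  # {'both effective': 0.5, 'group 1 more effective': 0.5}
```

A cell where individual ratings are both strengths but comparison answers are split, as here, is exactly the mismatch pattern the figures visualize.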

Fig 1.

The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

We ran descriptive statistics to summarize student responses to questions and examine distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions within the relevant questions from the same instrument. In all of these tests, we used a Bonferroni correction to lower the chance of false positives and account for multiple comparisons. We generated figures—primarily multi-pie chart graphs and heat maps—to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All analyses were conducted, and all figures generated, in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
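As an illustration of the test described above, the sketch below runs a chi-square goodness-of-fit comparison of one item's answer distribution across the two versions, with a Bonferroni-adjusted threshold. The authors worked in R; this is our stdlib-Python analogue, and all counts and the item total are hypothetical. With four answer options there are 3 degrees of freedom, for which the chi-square survival function has the closed form used below.

```python
import math

def chi2_sf_df3(x: float) -> float:
    """P(X > x) for a chi-square variable with 3 degrees of freedom."""
    return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

# Hypothetical counts per answer option:
# [Group 1 more effective, Group 2 more effective, both, neither]
with_individual = [120, 85, 60, 35]       # version with individual questions
without_individual = [118, 88, 58, 36]    # comparison-only version

# Expected counts: scale the comparison-only distribution to version 1's total
n1, n2 = sum(with_individual), sum(without_individual)
expected = [c * n1 / n2 for c in without_individual]

stat = sum((o - e) ** 2 / e for o, e in zip(with_individual, expected))
p = chi2_sf_df3(stat)

alpha_adjusted = 0.05 / 8  # Bonferroni: alpha divided by number of items tested
print(f"chi2 = {stat:.3f}, p = {p:.3f}, significant: {p < alpha_adjusted}")
```

Here the two versions' distributions are nearly identical, so the test is far from significance, matching the paper's overall pattern.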

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2 ) and, subsequently, answered a series of questions to compare and contrast the study approaches of both research groups side-by-side (group comparison items, Table 2 ). Through analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths ( Fig 2 ), evidenced by mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).
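The strength/weakness summary described here can be sketched as follows. The feature names and ratings are invented for illustration, not the study data.

```python
from statistics import mean, median

# Hypothetical individual evaluation ratings, 1 (weakness) to 4 (strength)
ratings = {
    "sampling methods": [4, 3, 4, 2, 3],
    "accounting for bias/error": [3, 4, 4, 4, 2],
    "number of trials": [2, 1, 2, 3, 2],
}

# Mean and median per feature; flag features with mean above 2 as
# generally rated strengths, mirroring the criterion in the text
summary = {feature: (mean(r), median(r)) for feature, r in ratings.items()}
rated_strengths = [f for f, (m, _) in summary.items() if m > 2]
print(rated_strengths)
```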

Fig 2.

Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.

Fig 3.

The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students’ responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario (B) Eco-BLIC owl-mouse scenario (C) PLIC oscillation periods of masses hanging on springs scenario.

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare and contrast thinking

Given that students were more discerning when they directly compared two groups for both biology and physics experimental scenarios, we next sought to determine if the individual evaluation questions for Group 1 or Group 2 were necessary to elicit or helpful to support student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions in Fig 4 as heat maps.

Fig 4.

The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

We ran chi-square goodness-of-fit tests on the answers between student responses on both instrument versions and there were no significant differences on the Eco-BLIC bass-mayfly scenario ( Fig 4A ; based on an adjusted p -value of 0.006) or owl-mouse questions ( Fig 4B ; based on an adjusted p-value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on both versions of the PLIC ( Fig 4C ; based on an adjusted p -value of 0.0005). The items that students responded to differently ( p <0.0005) across both versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramer’s C (Vc; [ 33 ]), a measure commonly applied to Chi-square goodness of fit models to understand the magnitude of significant results. We found that the effect sizes for these three items were small (Vc = 0.11, Vc = 0.10, Vc = 0.06, respectively).
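For reference, one common effect-size formula for a goodness-of-fit chi-square, V = sqrt(chi2 / (n(k − 1))) for n responses over k answer options, produces values of the magnitude reported above. The exact Vc formula from [33] may differ in detail, and the inputs here are hypothetical, chosen only to land near a small effect.

```python
import math

def cramers_v_gof(chi2_stat: float, n: int, k: int) -> float:
    """One common effect-size measure for a goodness-of-fit test
    with k categories and n observations (assumed formula, see lead-in)."""
    return math.sqrt(chi2_stat / (n * (k - 1)))

# Hypothetical: chi2 = 22.0 across 600 responses with 4 answer options
v = cramers_v_gof(22.0, 600, 4)
print(round(v, 2))  # 0.11, a small effect by conventional benchmarks
```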

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing between the two groups in the Eco-BLIC and PLIC rather than when evaluating each group individually. While students tended to independently evaluate study features of each group as strengths ( Fig 2 ), there was greater variation in their responses to which group was more effective when directly comparing between the two groups ( Fig 3 ). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting between two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [ 34 – 37 ]. When presented with a single example, students may deem certain study features as unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [ 38 ]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side, rather than in isolation [ 39 , 40 ]. This result is somewhat surprising, however, as students could have used their knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This work would help determine whether our results here suggest that students do not have a sufficient experiment-base to use as contrasts or if the students just do not use their experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions neither effectively engaged students in critical thinking nor prepared them for subsequent questions that elicit it. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, the study raises the question of whether instructors should incorporate more experimental case studies throughout their courses and assessments so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast between these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [ 38 ]. For example, though sample size is different between experimental scenarios in our instruments, it is a significant feature that has implications for other aspects of the research like statistical analyses and behaviors of the animals. Therefore, one limitation of our study could be that we exclusively focused on experimental method evaluation questions (i.e., what to trust), and we are unsure if the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC were designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates who we would expect to have more content knowledge and prior experiences for making comparisons in their respective disciplines [ 18 , 41 ]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offer students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking in introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively—though there is much more to explore regarding critical thinking processes of students across other STEM disciplines and in more advanced stages of their education. Undergraduate students in STEM need to be able to critically think for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to teach and measure cognitive student outcomes. Specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.

Supporting information

S1 Appendix, S2 Appendix, S3 Appendix

Acknowledgments

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

Funding Statement

This work was supported by the National Science Foundation under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). NSF: nsf.gov The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability


What is CER in Science & Why It’s Essential for Student Success


The Power of CER (Claim, Evidence, Reasoning) in Science

Ask any teacher. Every educator hopes to inspire students to be critical thinkers. That skill benefits learners in every content area, especially science. It’s also a skill that needs to be taught and reinforced. That’s why CER is so powerful.

What is CER, and why does it matter in scientific inquiry?

Science classes are all about  inquiry and investigation . Through scientific inquiry, learners understand the meaning of CER.

What does CER stand for? CER is Claim, Evidence, and Reasoning. It’s a three-step process that helps students develop  critical thinking skills in science class. Students stake a claim by answering a question that they need to prove. The evidence comes from demonstrating understanding and proof of that answer. Reasoning brings in the explanation piece. Students explain and support their thinking.

Claim: A statement that answers the question

Evidence: Data or observations that support the claim

Reasoning: An explanation of how the evidence supports the claim

CER examples: Real-world applications in science

Start with any question. Then, ask students to explain their answers using the claim, evidence, and reasoning process.

Here are some CER examples to use with your students:

  • Does cracking your knuckles lead to arthritis?
  • How often do people blink?
  • Which candy color (M&M, Skittles, etc.) is in the bag most often?
  • Does a person’s height change throughout the day?
  • What is the most popular sport in the United States?
  • What is the most popular app of the year?

Ask your students to brainstorm some questions of their own!
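For the candy-color question above, the evidence step can be as simple as a tally. Here is a minimal Python sketch of the claim-evidence-reasoning flow; the bag contents are made-up sample data, not real counts:

```python
from collections import Counter

# Hypothetical bag contents recorded by students (sample data)
bag = ["red", "blue", "green", "red", "orange",
       "red", "yellow", "blue", "red", "green"]

# Evidence: count how often each color appears
counts = Counter(bag)

# Claim: the most frequent color answers the question
claim_color, claim_count = counts.most_common(1)[0]

print(f"Claim: '{claim_color}' is the most common color in this bag.")
print(f"Evidence: it appeared {claim_count} of {len(bag)} times.")
print("Reasoning: a higher tally than any other color supports the claim.")
```

Students can then debate the reasoning step: does one bag justify a claim about all bags, or should the class pool its tallies first?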

Why claim, evidence, and reasoning are essential for developing critical thinking skills in students

CER trains students to think like scientists. It is an effective way to frame the workflow of questioning through experimentation, evidence, and explanation. Students begin to think about how evidence supports their claims and how to explain their findings. All of these are key components of critical thinking.

Stimulate scientific minds with CER activities in the classroom

Because science is the "study of the structure and behavior of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained," according to Dictionary.com, science classes offer countless opportunities for CER activities with students.  ExploreLearning Gizmos and STEM Cases are designed with this in mind.

Gizmos effectively use CER with whole and small group instruction and independent study. Students can answer questions about disease transmission in the Disease Spread Gizmo. What about ways to tell if a chemical change has occurred? Students can explore with the Chemical Changes Gizmo.

Disease Spread Gizmo

Chemical Changes Gizmo

STEM Cases are interactive case studies that place students in the role of a STEM professional whose task is to solve a real-world problem, which is perfect for CER. How about investigating Fruit Production and the Environment or Animal Group Behavior ?

Fruit Production and the Environment STEM Case

Animal Behavior STEM Case

What about the youngest scientists? It’s important to focus on science in the early years , too. Science4Us has lessons to start them on the right path for scientific inquiry and critical thinking using CER. Let them experience scientific inquiry with lessons about force and motion or living things .

Hoping for more information and guidance for using CER in your classroom? We’ve got you covered with professional development options.


How about a trial for more CER options?

Make chemistry come alive for your students in grades 3+ with Gizmos virtual simulations.

Let the youngest learners experiment with chemistry through these experiments featured in Science4Us .


Five Examples of Critical Thinking Skills


What critical thinking examples can we give to clarify the meaning of critical thinking? This article defines and provides five examples of critical thinking skills in various fields or disciplines.

Introduction

In teaching, we usually invoke the term critical thinking when we want students to think at a higher level, as described in the cognitive domain of Bloom's Taxonomy. We call these skills the Higher Order Thinking Skills, or HOTS.

But how is critical thinking demonstrated? What should we look for in students' work as an indicator that they have thought critically?

Here I clarify this sometimes vague concept, one that is frequently mentioned but rarely applied concretely in the delivery of lessons or courses. As teachers or mentors, we need this concept to be crystal clear in our minds so that we can recognize demonstrations of critical thinking and incorporate them into our rubrics.

Alright. Let's proceed by first defining critical thinking. I will then give five examples of critical thinking in different disciplines.

Definition of Critical Thinking and Its Importance

Critical thinking is a crucial skill that plays a significant role in education. It involves the ability to analyze, evaluate, and interpret information logically.

Critical thinking is the ability to analyze, evaluate, and interpret information logically.

By encouraging critical thinking, educators aim to develop students’ problem-solving abilities, enhance their decision-making skills, and foster independent and creative thinking.

In today’s rapidly changing world, where information is readily available and constantly evolving, critical thinking has become even more essential. It enables individuals to navigate through the vast amount of information, distinguish between reliable and unreliable sources, and make informed judgments.

Critical thinking helps students develop a deeper understanding of the subjects they study, as they learn to question assumptions, challenge existing knowledge, and explore alternative perspectives.

By incorporating critical thinking into education, students are better equipped to face real-world challenges. They become more adaptable, open-minded, and capable of making well-reasoned decisions.

Critical thinking also promotes effective communication and collaboration, as students learn to articulate their thoughts, listen to others’ viewpoints, and engage in constructive discussions.


Critical Thinking Examples Across 5 Disciplines

In this section, we will explore five critical thinking examples across different disciplines, including environmental science, statistics, engineering, science, and humanities. Each example will highlight how we can improve critical thinking skills through specific teaching strategies .

1. Environmental Science

One example of critical thinking in environmental science is analyzing the impact of human activities on ecosystems. By teaching students to evaluate the consequences of actions such as deforestation or pollution, they can develop a deeper understanding of the interconnectedness of the environment.

Engaging students in hands-on experiments about pollution , fieldwork, and case studies can enhance their critical thinking skills by encouraging them to question assumptions, consider alternative solutions, and evaluate the long-term effects of human actions.

For instance, in a classroom setting, we can present students with a case study on the effects of deforestation on a specific ecosystem. We can then ask them to analyze the data, identify the underlying causes, and propose sustainable solutions.

By doing so, we encourage students to think critically about the complex relationship between human activities and the environment, considering both short-term and long-term consequences.

2. Statistics

Critical thinking in statistics involves interpreting and analyzing data to make informed decisions. Teaching students to question the validity of data sources, identify biases, and analyze statistical methods can improve their critical thinking skills.

Incorporating real-world examples, interactive data analysis exercises, and group discussions can enhance students’ ability to evaluate the reliability of statistical information and draw accurate conclusions.

For example, we can give students a dataset and ask them to evaluate critically the method or methodology used to collect the data, identify any potential biases, and draw meaningful conclusions.

By engaging in group discussions, students can compare their findings, challenge each other’s assumptions, and develop a deeper understanding of the limitations and strengths of statistical analysis .
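As a concrete illustration of this kind of scrutiny, students might check how a single suspicious value affects a summary statistic before trusting a conclusion. A minimal Python sketch with made-up survey numbers (the 12-hour screening cutoff is itself a judgment call students should debate):

```python
import statistics

# Hypothetical survey responses (made-up data): hours of sleep per night
sample = [7.0, 6.5, 8.0, 7.5, 6.0, 7.0, 23.0]  # 23.0 looks like a data-entry error

mean_with_outlier = statistics.mean(sample)
median_with_outlier = statistics.median(sample)

# A critical reader asks: does one suspicious value drive the conclusion?
cleaned = [x for x in sample if x <= 12]  # crude screen; cutoff is an assumption
mean_cleaned = statistics.mean(cleaned)

print(f"Mean with outlier:    {mean_with_outlier:.2f}")
print(f"Median with outlier:  {median_with_outlier:.2f}")
print(f"Mean after screening: {mean_cleaned:.2f}")
```

Comparing the three numbers makes the point vividly: the mean shifts by more than two hours, while the median barely moves, so the choice of statistic is part of the argument.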

3. Engineering

Critical thinking in engineering involves problem-solving and innovation. By presenting students with complex engineering challenges, educators can foster critical thinking skills.

Encouraging students to brainstorm, analyze constraints, and propose creative solutions can enhance their ability to think critically. Incorporating project-based learning, teamwork, and hands-on experiments can further develop their critical thinking skills in the engineering field.

For instance, we can task students with designing and building a prototype to solve a specific engineering problem. Throughout the process, they are required to think critically about the constraints, consider alternative approaches, and evaluate the feasibility of their solutions.

By working collaboratively in teams, students can also learn from each other’s perspectives and develop a more comprehensive understanding of the problem at hand.

4. Science

Critical thinking in science involves questioning existing theories, designing experiments, and analyzing results. By teaching students to challenge assumptions, evaluate evidence, and draw logical conclusions, educators can enhance their critical thinking skills.

Engaging students in scientific inquiry, encouraging them to develop hypotheses, and providing opportunities for peer review and scientific debate can further improve their ability to think critically.

For example, we can give students a scientific research paper and have them critically evaluate the method or methodology , analyze the results, and draw conclusions based on the evidence presented.

By engaging in peer review and scientific debate, students can refine their critical thinking skills by challenging each other’s interpretations, identifying potential flaws in the research, and proposing alternative explanations.

5. Humanities

Critical thinking in humanities involves analyzing and interpreting texts, artworks, and historical events. By teaching students to question biases, analyze multiple perspectives, and evaluate evidence, educators can enhance their critical thinking skills. Incorporating class discussions, debates, and critical analysis of primary and secondary sources can further develop students’ ability to think critically in the humanities.

For instance, we can assign students a historical event and request them to analyze primary and secondary sources critically, in order to gain a deeper understanding of the event from multiple perspectives.

By engaging in class discussions and debates, students can develop their critical thinking skills by challenging prevailing narratives, questioning biases, and evaluating the reliability of different sources.

By exploring these five examples, we can see that specific teaching strategies in various disciplines can improve critical thinking skills. These examples show the importance of incorporating critical thinking into education to equip students with the skills necessary to navigate complex challenges and make informed decisions.

Conclusions and Recommendations

Based on the discussion in the previous section, critical thinking skills are essential across various disciplines. To effectively develop these skills, educators should employ specific teaching strategies that encourage students to think critically.

In conclusion, to develop critical thinking skills, educators should employ teaching strategies as shown in the five critical thinking examples, such as hands-on experiments, real-world examples, project-based learning, and critical analysis. By incorporating these strategies, students can navigate complex challenges, make informed decisions, and become critical thinkers in their respective fields.


About the Author

Dr. Regoniel, a faculty member of the graduate school, has served as a consultant to various environmental research and development projects covering climate change, coral reef resources and management, economic valuation of environmental and natural resources, mining, and waste management and pollution. He has extensive experience in applied statistics and systems modelling and analysis, is an avid practitioner of LaTeX, and is a multidisciplinary web developer. He leverages pioneering AI-powered content creation tools to produce unique and comprehensive articles on this website.


25 Critical Thinking Examples


Critical thinking is the ability to analyze information and make reasoned decisions. It involves suspension of judgment, open-mindedness, and clarity of thought.

It involves considering different viewpoints and weighing evidence carefully. It is essential for solving complex problems and making good decisions.

People who think critically are able to see the world in a more nuanced way and understand the interconnectedness of things. They are also better able to adapt to change and handle uncertainty.

In today’s fast-paced world, the ability to think critically is more important than ever and necessary for students and employees alike.

Critical Thinking Examples

1. Identifying Strengths and Weaknesses

Critical thinkers don’t just take things at face value. They stand back and contemplate the potential strengths and weaknesses of something and then make a decision after contemplation.

This helps you to avoid excessive bias and identify possible problems ahead of time.

For example, a boxer about to get in the ring will likely need to evaluate the strengths and weaknesses of his opponent. He might learn that his opponent’s left hook is very strong, but his opponent also gets tired after the third round. With this knowledge, he can go into the bout with strong defenses in the first three rounds before going on the offense.

Here, the boxer’s critical thinking skills will help him win his match.

2. Creating a Hypothesis based on Limited Data

When scientists set out to test a new theory, they first need to develop a hypothesis. This is an educated guess about how things work, based on what is already known.

Once a hypothesis has been developed, experiments can be designed to test it.

However, sometimes scientists may find themselves working with limited data. In such cases, they may need to make some assumptions in order to form a hypothesis.

For example, if they are studying a phenomenon that occurs infrequently, they may need to extrapolate from the data they do have in order to form a hypothesis.

Here, the scientist is engaged in critical thinking: they use the limited data to come up with a tentative judgment.
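The extrapolation step described above can be sketched numerically. Assuming a few made-up observations of some quantity over time, a hand-rolled least-squares line gives the scientist a tentative, testable hypothesis about unobserved values:

```python
# Hypothetical sparse observations (made-up data): x = time, y = measurement
xs = [1.0, 2.0, 4.0]
ys = [2.1, 3.9, 8.1]

# Ordinary least-squares fit of y = a*x + b, computed by hand
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

# Hypothesis: extrapolate to an unobserved point, treating it as tentative
prediction = a * 6.0 + b
print(f"Fitted line: y = {a:.2f}x + {b:.2f}")
print(f"Tentative prediction at x = 6: {prediction:.2f}")
```

The critical-thinking part is the framing: with only three points, the fitted line is a guess to be tested by new data, not a conclusion.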

3. Moderating a Debate

A debate moderator needs to have strong critical thinking skills. They need to use objective evaluations, analysis, and critique to keep the discussion on track and ensure that all sides are heard fairly.

This means being able to identify when a point has been made sufficiently, or when someone is beginning to veer off topic and being able to direct the conversation accordingly.

Similarly, they need to be able to assess each argument objectively and consider its merits, rather than getting caught up in the emotion of the debate. If someone is using an unfair point or one that is not factual, the moderator needs to be switched on and identify this.

By remaining calm and impartial, the moderator can help to ensure that a debate is productive and respectful.

4. Judging and Adjudicating

A judge or adjudicator needs to weigh the evidence and make a determination based on the facts.

This requires the adjudicator to be able to try to see both sides of an argument. They need the ability to see past personal biases and to critically evaluate the credibility of all sides.

In addition, judges and adjudicators must be able to think quickly and make sound decisions in the face of complex issues.

For example, if you were to be adjudicating the above debate, you need to hear both sides of the argument and then decide who won. It’s your job to evaluate, see strengths and weaknesses in arguments, and come to a conclusion.

5. Grading an Essay

Teachers need critical thinking skills when grading essays so that they can effectively assess the quality of the writing. By critically analyzing the essay, teachers can identify any errors or weaknesses in the argument.

Furthermore, they can also determine whether the essay meets the required standards for the assignment. Even a very well-written essay may deserve a lower grade if the essay doesn’t directly answer the essay question.

A teacher needs to be able to read an essay and understand not only what the student is trying to say, but also how well they are making their argument. Are they using evidence effectively? Are they drawing valid conclusions? A teacher needs to be able to evaluate an essay holistically in order to give a fair grade.

In order to properly evaluate an essay, teachers need to be able to think critically about the writing. Only then can they provide an accurate assessment of the work.

6. Active Reading

Active reading is a skill that requires the reader to be engaged with the text in order to fully understand it. This means not only being able to read the words on the page, but also being able to interpret the meaning behind them.

In order to do this, active readers need to have good critical thinking skills.

They need to be able to ask questions about the text and look for evidence to support their answers. Additionally, active readers need to be able to make connections between the text and their own experiences.

Active reading leads to better comprehension and retention of information.

7. Deciding Whether or Not to Believe Something

When trying to determine whether or not to believe something, you’re engaging in critical thinking.

For example, you might need to consider the source of the information. If the information comes from a reliable source, such as a reputable news organization or a trusted friend, then it is more likely to be accurate.

However, if the source is less reliable, such as an anonymous website or a person with a known bias, then the information should be viewed with more skepticism.

In addition, it is important to consider the evidence that is being presented. If the evidence is well-supported and logically presented, then it is more likely to be true. However, if the evidence is weak or relies on fallacious reasoning, then the claim is less likely to be true.

8. Determining the Best Solution to a Situation

Determining the best solution to a problem generally requires you to critique the different options. There are often many different factors to consider, and it can be difficult to know where to start.

However, there are some general guidelines that can help to make the process a little easier.

For example, if you have a few possible solutions to the problem, it is important to weigh the pros and cons of each one. Consider both the short-term and long-term effects of each option before making a decision.

Furthermore, it is important to be aware of your own biases. Be sure to consider all of the options objectively, without letting your personal preferences get in the way.

9. Giving Formative Feedback

Formative feedback is feedback that you give to someone part-way through a learning experience. To do this, you need to think critically.

For example, one thing you need to do is see where the student's strengths and weaknesses lie. Perhaps the student is doing extremely well at a task, so your feedback might be that they should try to extend themselves by adding more complexity to the task.

Or, perhaps the student is struggling, so you suggest to them that they approach the learning experience from a different angle.

10. Giving Summative Feedback

Summative feedback occurs at the end of a learning scenario. For example, the written feedback at the end of an essay or on a report card is summative.

When providing summative feedback, it is important to take a step back and consider the situation from multiple perspectives. What are areas for improvement and where exactly might the student have missed some key points? How could the student have done better?

Asking yourself these questions is all part of the process of giving feedback, and they can all be considered examples of critical thinking. You’re literally critiquing the student’s work and identifying opportunities for improvement.

11. Evaluating Evidence

When evaluating evidence, critical thinkers take a step back and look at the bigger picture. They consider all of the available information and weigh it up. They look at logical flaws, the reliability of the evidence, and its validity.

This process allows them to arrive at a conclusion that is based on sound reasoning, rather than emotion or personal bias.

For example, when a social scientist looks at the evidence from his study, he needs to evaluate whether the data was corrupted and ensure the methodology was sound in order to determine if the evidence is valuable or not.

12. Media Literacy

Media literacy seems to be in short supply these days. Too many people take information off the internet or television and just assume it is true.

A person with media literacy, however, will not just trust what they see and read. Instead, they look at the data and weigh up the evidence. They will see if there was a sound study to back up claims. They will see if there is bias in the media source and whether it’s just following an ideological line.

Furthermore, they will make sure they seek out trustworthy media sources. These are not just media sources you like or that confirm your own point of view. They need to be sources that do their own research, find solid data, and don’t pursue one blind agenda.

13. Asking your Own Questions

Asking your own questions is an important part of critical thinking. When you ask questions, you are forcing yourself to think more deeply about the information you are considering.

Asking questions also allows you to gather more information from others who may have different perspectives.

This helps you to better understand the issue and to come up with your own conclusions.

So, often at schools, we give students a list of questions to ask about something in order to dig deeper into it. For example, in a book review lesson, the teacher might give a list of questions to ask about the book’s characters and plot.

14. Conducting Rigorous Research

Research is a process of inquiry that encompasses the gathering of data, interpretation of findings, and communication of results. The researcher needs to engage in critical thinking throughout the process, but most importantly, when designing their methodology.

Research can be done through a variety of methods, such as experiments, surveys, interviews, and observations. Each method has strengths and weaknesses.

Once the data has been collected, it must be analyzed and interpreted. This is often done through statistical methods or qualitative analysis.

Research is an essential tool for discovering new knowledge and for solving problems, but researchers need to think critically about how valid and reliable their data truly is.

15. Examining your own Beliefs and Prejudices

It’s important to examine your own beliefs and prejudices in order to ensure that they are fair and accurate. People who don’t examine their own beliefs have not truly critically examined their lives.

One way to do this is to take the time to consider why you believe what you do. What experiences have you had that have led you to this belief? Are there other ways to interpret these experiences? It’s also important to be aware of the potential for confirmation bias , which is when we seek out information that confirms our existing beliefs, while ignoring information that contradicts them.

This can lead us to hold onto inaccurate or unfair beliefs even when presented with evidence to the contrary.

To avoid this, it’s important to seek out diverse perspectives, and to be open-minded when considering new information. By taking these steps, you can help ensure that your beliefs are fair and accurate.

16. Looking at a Situation from Multiple Perspectives

One of the most important critical thinking skills that you can learn in life is how to look at a situation from multiple perspectives.

Being able to see things from different angles can help you to understand complex issues, spot potential problems, and find creative solutions. It can also help you to build better relationships, as you will be able to see where others are coming from and find common ground.

There are a few simple techniques that you can use to develop this skill.

First, try to imagine how someone else would feel in the same situation.

Second, put yourself in their shoes and try to see things from their point of view.

Finally, ask yourself what other factors may be influencing their perspective. By taking the time to view things from multiple angles, you will be better prepared to deal with whatever life throws your way.

17. Considering Implications before Taking Action

When faced with a difficult decision, it is important to consider the implications of each option before settling on a course of action.

This is because the consequences of our actions can be far-reaching and often unforeseen.

For example, a seemingly small decision like whether to attend a party or not might have much larger implications. If we decide to go to the party, we might miss an important deadline at work.

However, if we stay home, we might miss out on an opportunity to meet new people and make valuable connections.

In either case, our choice can have a significant impact on our lives.

Fortunately, critical thinking can help people to make well-informed decisions that could have a positive impact on their lives.

For example, you might weigh up the pros and cons of attending the party and identify potential downsides, like whether you might end up in a car with an impaired driver, and whether the party is really worth a missed deadline at work.

Having weighed up the potential outcomes, you can make a more rational and informed decision.

18. Reflective Practice

Reflecting on your actions is an important part of critical thinking. When you take the time to reflect, you are able to step back and examine your choices and their consequences more objectively.

This allows you to learn from your mistakes and make better decisions in the future.

In order to reflect effectively, it is important to be honest with yourself and open to learning new things. You must also be willing to question your own beliefs and assumptions. By taking these steps, you can develop the critical thinking skills that are essential for making sound decisions in the future.

This habit of reflection also helps you to continually improve yourself.

19. Problem-Solving

Problem-solving requires the ability to think critically in order to accurately assess a situation and determine the best course of action.

This means being able to identify the root cause of a problem, as well as any potential obstacles that may stand in the way of a solution. It also involves breaking down a problem into smaller, more manageable pieces in order to more easily find a workable solution.

In addition, critical thinking skills also require the ability to think creatively in order to come up with original solutions to these problems.

Go Deeper: Problem-Solving Examples

20. Brainstorming New Solutions

When brainstorming new solutions, critical thinking skills are essential in order to generate fresh ideas and identify potential issues.

For example, the ability to identify the problems with the last solution you tried is important in order to come up with better solutions this time. Similarly, analytical thinking is necessary in order to evaluate the feasibility of each idea. Furthermore, it is also necessary to consider different perspectives and adapt to changing circumstances.

By utilizing all of these critical thinking skills, it will be possible to develop innovative solutions that are both practical and effective.

21. Reserving Judgment

A key part of critical thinking is reserving judgment. This means that we should not rush to conclusions, but instead take the time to consider all the evidence before making up our minds.

By reserving judgment, we can avoid making premature decisions that we might later regret. We can also avoid falling victim to confirmation bias, which is the tendency to only pay attention to information that supports our existing beliefs.

Instead, by keeping an open mind and considering all the evidence, we can make better decisions and reach more accurate conclusions.

22. Identifying Deceit

Critical thinking is an important skill to have in any situation, but it is especially important when trying to identify deceit.

There are a few key things to look for when using critical thinking to identify deceit.

First, pay attention to the person’s body language. Second, listen closely to what the person is saying and look for any inconsistencies. Finally, try to get a sense of the person’s motive – why would they want to deceive you?

Each of these questions helps you to not just take things at face value. Instead, you’re critiquing the situation and coming to a conclusion using all of your intellect and senses, rather than just believing what you’re told.

23. Being Open-Minded to New Evidence that Contradicts your Beliefs

People with critical thinking skills are more open-minded because they are willing to consider different points of view and evidence.

They also realize that their own beliefs may be wrong and are willing to change their minds if new information is presented.

Similarly, people who are not critical thinkers tend to be closed-minded because they fail to critique themselves and challenge their own mindset. This can lead to conflicts, as closed-minded people are not willing to budge on their beliefs even when presented with contradictory evidence.

Critical thinkers, on the other hand, are able to have more productive conversations as they are willing to listen to others and consider different viewpoints. Ultimately, being open-minded and willing to change one’s mind is a sign of intelligence and maturity.

24. Accounting for Bias

We all have biases, based on our individual experiences, perspectives, and beliefs. These can lead us to see the world in a certain way and to interpret information in a way that supports our existing views.

However, if we want to truly understand an issue, it is important to try to put aside our personal biases and look at the evidence objectively.

This is where critical thinking skills come in.

By using critical thinking, we can examine the evidence dispassionately and assess different arguments without letting our own prejudices get in the way. Start by looking at weaknesses and logical flaws in your own thinking.

Play the devil’s advocate.

In this way, you can start to get a more accurate picture of an issue and make more informed decisions.

25. Basing your Beliefs on Logic and Reasoning

In order to lead a successful and fulfilling life, it is important to base your beliefs on logic and reasoning.

This does not mean that you should never believe in something without evidence, but it does mean that you should be thoughtful and intentional about the things that you choose to believe.

One way to ensure that your beliefs are based on logic and reasoning is to seek out reliable sources of information. Another method is to use thought games to follow all your thoughts to their logical conclusions.

By basing your beliefs on logic and reasoning, you will be more likely to make sound decisions, and less likely to be swayed by emotions or misinformation.

Critical thinking is an important skill for anyone who wants to be successful in the modern world. It allows us to evaluate information and make reasoned decisions, rather than simply accepting things at face value. 

Thus, employers often seek out people with strong critical thinking skills. These employees can solve problems independently and identify ways to improve the workplace. They are able to push back against bad decisions and use their own judgment to make good ones.

Furthermore, critical thinking skills are important for students. This is because they need to be able to evaluate information and think through problems with a critical mindset in order to learn and improve.

Chris Drew (PhD)

Dr. Chris Drew is the founder of the Helpful Professor. He holds a PhD in education and has published over 20 articles in scholarly journals. He is the former editor of the Journal of Learning Development in Higher Education.




Pseudoscience examples for critical thinking skills

Snake oil, grapefruit diets, flat-earth theories—pseudoscience is something to be ignored, right? Not in science class! Studying pseudoscience is actually a great way to help students think like scientists, not to mention savvy citizens. We guarantee it, or your money back! (Kidding.)


MIRACLE HAIR GROWTH! 

Quantum hair activation technology: This groundbreaking innovation goes beyond conventional science, delving into the realm of quantum energy to stimulate hair growth at the subatomic level. Blended with rare botanicals from ancient civilizations for luster and shine. Limited-time offer: Act now and receive a vial of stardust-infused hair serum!

Effective product…or pseudoscience? We’ll bet you guessed it. (Sorry, no stardust serum for you!)

While this hair product itself sounds like junk, reading about it can be a valuable experience for science students.

Teaching your students to identify pseudoscience in the world around them helps them learn to protect themselves from false claims that can be money-wasting at best, dangerous at worst.

And as they learn to discern, they also develop lifelong critical thinking skills!

“We say knowledge is power but it's not enough to know things, and there's too much to know. Being able to think and not fall for someone's bunk is my goal for my students.”  —Melanie Trecek-King, biology professor and guest in Science Connections podcast Season 3, Episode 5: Thinking is power

Let’s explore how educators can use examples of pseudoscience to develop critical thinking skills—and incorporate NGSS (Next Generation Science Standards) science and engineering practices into their approach.

What’s the difference between science and pseudoscience?

Science is grounded in empirical evidence, rigorous testing, and the scientific method. Pseudoscience presents itself as scientific but lacks the fundamental elements of genuine scientific inquiry: evidence, peer review, and the capacity to generate accurate predictions.

Though pseudoscience may make vague claims, it has clear characteristics. When something is pseudoscience, it:

  • Can’t be proven wrong: Makes claims that are unobservable or too vague.
  • Professes “proof” without presenting actual evidence: Presents only anecdotal evidence, if any.
  • Uses technobabble: See: “Quantum hair activation technology.”

For more characteristics of pseudoscience, check out Melanie Trecek-King’s episode of Science Connections!

To be sure, not all pseudoscience is harmful—pursuits and activities such as aromatherapy and astrology can be positive experiences in people’s lives—it just should not be defined as or considered science.

How addressing pseudoscience encourages critical thinking

When you teach students to identify pseudoscience, you are teaching them to use an evidence- and research-based approach when analyzing claims. Which is…science!

You are also:

  • Teaching them to engage in thoughtful and educational argument/debate.
  • Encouraging them to use their knowledge of science in the real world.
  • Creating real-world impact.

When students learn to identify pseudoscience—faulty products, myths, and disprovable “discoveries”—they’ll be prepared and informed when making real-world decisions.

Critical thinking exercises inspired by pseudoscience

We’ve talked about “miracle” hair growth treatments, which are more commonly targeted to adults. Students may have more commonly encountered claims about or ads for alkaline water or detox diets, conspiracy theories and instances of science denial, astrology, and more. These examples offer great opportunities to discuss how to determine the difference between science and pseudoscience.

Suggested activities:

  • Pseudoscience Sherlock: Ask students to find examples of pseudoscience in real life on social media, in products sold in stores, or elsewhere on the internet. Tell them to pay close attention to “articles” that are really ads.
  • Pseudoscience lab: Prompt students to back up their claim that a given example represents pseudoscience with evidence: e.g., lack of empirical evidence, controlled experiments, or unbiased sample; absence of peer-reviewed research; reliance on anecdotes; hyperbolic and unprovable claims.
  • Snake oil! Ask students to practice identifying pseudoscience by creating their own advertisements, commercials, or news segments for fake products or scientific “advancements.”
  • Spread the word: Ask students to create flyers, PSAs, or articles on how to identify the characteristics of pseudoscience.

Other activities that incorporate the NGSS while also sniffing out pseudoscience:

  • Asking questions: Encourage students to ask probing questions about pseudoscientific claims. How does this claim defy our current understanding of the natural world? What empirical evidence is missing?
  • Developing and using models: Have students create models that illustrate the differences between a pseudoscientific claim and a well-established scientific concept. This visual representation supports understanding and critical analysis.
  • Engaging in argument from evidence: Arrange debates where students argue for or against a pseudoscientific claim using evidence-based reasoning. This practice sharpens their ability to critically evaluate information.
  • Obtaining, evaluating, and communicating information: Ask students to research the history and impact of a specific pseudoscientific belief. Have them present their findings, highlighting how critical thinking could have prevented widespread acceptance of the claim.

Using examples of pseudoscience in your science classroom can help students learn to not only think like scientists, but navigate the real world, too.

Bertha Vasquez, former teacher and current director of education at the Center for Inquiry, has used these approaches with her students. As she shared on Season 3, Episode 6 of Science Connections : “I guarantee you that those students, when they walked into a store with their parents and they saw a product [with] a money-back guarantee [that] cures way too many things, and it’s based on ‘ancient plant wisdom’ and has ‘scientific’ language on the box, they may go, ‘Mom, I think these people are trying to sell you some pseudoscience.’”

More to explore

Science Connections

  • Season 3, Episode 5: Thinking is power
  • Season 3, Episode 6: Identifying and addressing pseudoscience

Back-to-school science toolkit for administrators, teachers, and caregivers

Related resources


Science Connections is a podcast created for K–8 science educators. Along with guests, host Eric Cross explores the best ways to improve K–8 science teaching p...


Amplify Science

Amplify Science is a K–8 science curriculum that blends hands-on investigations, literacy-rich activities, and interactive digital tools to empower students to...


Resource site

Amplify Science resource site

