What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Other interesting articles
  • Frequently asked questions about critical thinking

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

Example: Good critical thinking in an academic context You’re researching a paper on a new medical treatment and read a study reporting positive results. However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

Example: Good critical thinking in a nonacademic context You’re considering buying a home alarm system and read a glowing review of one model. However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

If you want to know more about ChatGPT, AI tools, citation, and plagiarism, make sure to check out some of our other articles with explanations and examples.

  • ChatGPT vs human editor
  • ChatGPT citations
  • Is ChatGPT trustworthy?
  • Using ChatGPT for your studies
  • What is ChatGPT?
  • Chicago style
  • Paraphrasing

 Plagiarism

  • Types of plagiarism
  • Self-plagiarism
  • Avoiding plagiarism
  • Academic integrity
  • Consequences of plagiarism
  • Common knowledge


Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy, it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. It also involves recalling information best when it amplifies what we already believe; relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article


Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved April 15, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/


41+ Critical Thinking Examples (Definition + Practices)


Critical thinking is an essential skill in our information-overloaded world, where figuring out what is fact and fiction has become increasingly challenging.

But why is critical thinking essential? Put simply, critical thinking empowers us to make better decisions, challenge and validate our beliefs and assumptions, and understand and interact with the world more effectively and meaningfully.

Critical thinking is like using your brain's "superpowers" to make smart choices. Whether it's picking the right insurance, deciding what to do in a job, or discussing topics in school, thinking deeply helps a lot. In the next parts, we'll share real-life examples of when this superpower comes in handy and give you some fun exercises to practice it.

Critical Thinking Process Outline


Critical thinking means thinking clearly and fairly without letting personal feelings get in the way. It's like being a detective, trying to solve a mystery by using clues and thinking hard about them.

It isn't always easy to think critically, as it can take a pretty smart person to see some of the questions that aren't being answered in a certain situation. But, we can train our brains to think more like puzzle solvers, which can help develop our critical thinking skills.

Here's what it looks like step by step:

Spotting the Problem: It's like discovering a puzzle to solve. You see that there's something you need to figure out or decide.

Collecting Clues: Now, you need to gather information. Maybe you read about it, watch a video, talk to people, or do some research. It's like getting all the pieces to solve your puzzle.

Breaking It Down: This is where you look at all your clues and try to see how they fit together. You're asking questions like: Why did this happen? What could happen next?

Checking Your Clues: You want to make sure your information is good. This means seeing if what you found out is true and if you can trust where it came from.

Making a Guess: After looking at all your clues, you think about what they mean and come up with an answer. This answer is like your best guess based on what you know.

Explaining Your Thoughts: Now, you tell others how you solved the puzzle. You explain how you thought about it and how you answered. 

Checking Your Work: This is like looking back and seeing if you missed anything. Did you make any mistakes? Did you let any personal feelings get in the way? This step helps make sure your thinking is clear and fair.

And remember, you might sometimes need to go back and redo some steps if you discover something new. If you realize you missed an important clue, you might have to go back and collect more information.

Critical Thinking Methods

Just like doing push-ups or running helps our bodies get stronger, there are special exercises that help our brains think better. These brain workouts push us to think harder, look at things closely, and ask many questions.

It's not always about finding the "right" answer. Instead, it's about the journey of thinking and asking "why" or "how." Doing these exercises often helps us become better thinkers and makes us curious to know more about the world.

Now, let's look at some brain workouts to help us think better:

1. "What If" Scenarios

Imagine crazy things happening, like, "What if there was no internet for a month? What would we do?" These games help us think of new and different ideas.

2. Debate Both Sides

Pick a hot topic. Argue one side of it and then try arguing the opposite. This makes us see different viewpoints and think deeply about a topic.

3. Analyze Visual Data

Check out charts or pictures with lots of numbers and info but no explanations. What story are they telling? This helps us get better at understanding information just by looking at it.

4. Mind Mapping

Write an idea in the center and then draw lines to related ideas. It's like making a map of your thoughts. This helps us see how everything is connected.

There's lots of mind-mapping software, but it's also nice to do this by hand.

5. Weekly Diary

Every week, write about what happened, the choices you made, and what you learned. Writing helps us think about our actions and how we can do better.

6. Evaluating Information Sources

Collect stories or articles about one topic from newspapers or blogs. Which ones are trustworthy? Which ones might be a little biased? This teaches us to be smart about where we get our info.

There are many resources to help you determine if information sources are factual or not.

7. Socratic Questioning

This way of thinking is called the Socratic Method, named after the ancient Greek philosopher Socrates. It's about asking lots of questions to understand a topic. You can do this by yourself or chat with a friend.

Start with a Big Question:

"What does 'success' mean?"

Dive Deeper with More Questions:

"Why do you think of success that way?" "Do TV shows, friends, or family make you think that?" "Does everyone think about success the same way?"

"Can someone be a winner even if they aren't rich or famous?" "Can someone feel like they didn't succeed, even if everyone else thinks they did?"

Look for Real-life Examples:

"Who is someone you think is successful? Why?" "Was there a time you felt like a winner? What happened?"

Think About Other People's Views:

"How might a person from another country think about success?" "Does the idea of success change as we grow up or as our life changes?"

Think About What It Means:

"How does your idea of success shape what you want in life?" "Are there problems with only wanting to be rich or famous?"

Look Back and Think:

"After talking about this, did your idea of success change? How?" "Did you learn something new about what success means?"


8. Six Thinking Hats 

Edward de Bono came up with a cool way to solve problems by thinking in six different ways, like wearing different colored hats. You can do this independently, but it might be more effective in a group so everyone can have a different hat color. Each color has its way of thinking:

White Hat (Facts): Just the facts! Ask, "What do we know? What do we need to find out?"

Red Hat (Feelings): Talk about feelings. Ask, "How do I feel about this?"

Black Hat (Careful Thinking): Be cautious. Ask, "What could go wrong?"

Yellow Hat (Positive Thinking): Look on the bright side. Ask, "What's good about this?"

Green Hat (Creative Thinking): Think of new ideas. Ask, "What's another way to look at this?"

Blue Hat (Planning): Organize the talk. Ask, "What should we do next?"

When using this method with a group:

  • Explain all the hats.
  • Decide which hat to wear first.
  • Make sure everyone switches hats at the same time.
  • Finish with the Blue Hat to plan the next steps.

9. SWOT Analysis

SWOT Analysis is like a game plan for businesses to know where they stand and where they should go. "SWOT" stands for Strengths, Weaknesses, Opportunities, and Threats.

There are a lot of SWOT templates out there for how to do this visually, but you can also think it through. It doesn't just apply to businesses but can be a good way to decide if a project you're working on is working.

Strengths: What's working well? Ask, "What are we good at?"

Weaknesses: Where can we do better? Ask, "Where can we improve?"

Opportunities: What good things might come our way? Ask, "What chances can we grab?"

Threats: What challenges might we face? Ask, "What might make things tough for us?"

Steps to do a SWOT Analysis:

  • Goal: Decide what you want to find out.
  • Research: Learn about your business and the world around it.
  • Brainstorm: Get a group and think together. Talk about strengths, weaknesses, opportunities, and threats.
  • Pick the Most Important Points: Some things might be more urgent or important than others.
  • Make a Plan: Decide what to do based on your SWOT list.
  • Check Again Later: Things change, so look at your SWOT again after a while to update it.

Now that you have a few tools for thinking critically, let’s get into some specific examples.

Everyday Examples

Life is a series of decisions. From the moment we wake up, we're faced with choices – some trivial, like choosing a breakfast cereal, and some more significant, like buying a home or confronting an ethical dilemma at work. While it might seem that these decisions are disparate, they all benefit from the application of critical thinking.

10. Deciding to buy something

Imagine you want a new phone. Don't just buy it because the ad looks cool. Think about what you need in a phone. Look up different phones and see what people say about them. Choose the one that's the best deal for what you want.

11. Deciding what is true

There's a lot of news everywhere. Don't believe everything right away. Think about why someone might be telling you this. Check if what you're reading or watching is true. Make up your mind after you've looked into it.

12. Deciding when you’re wrong

Sometimes, friends can have disagreements. Don't just get mad right away. Try to see where they're coming from. Talk about what's going on. Find a way to fix the problem that's fair for everyone.

13. Deciding what to eat

There's always a new diet or exercise that's popular. Don't just follow it because it's trendy. Find out if it's good for you. Ask someone who knows, like a doctor. Make choices that make you feel good and stay healthy.

14. Deciding what to do today

Everyone is busy with school, chores, and hobbies. Make a list of things you need to do. Decide which ones are most important. Plan your day so you can get things done and still have fun.

15. Making Tough Choices

Sometimes, it's hard to know what's right. Think about how each choice will affect you and others. Talk to people you trust about it. Choose what feels right in your heart and is fair to others.

16. Planning for the Future

Big decisions, like where to go to school, can be tricky. Think about what you want in the future. Look at the good and bad of each choice. Talk to people who know about it. Pick what feels best for your dreams and goals.


Job Examples

17. Solving Problems

When a machine breaks at a factory, workers brainstorm ways to fix it quickly without making things worse.

18. Decision Making

A store manager decides which products to order more of based on what's selling best.

19. Setting Goals

A team leader helps their team decide what tasks are most important to finish this month and which can wait.

20. Evaluating Ideas

At a team meeting, everyone shares ideas for a new project. The group discusses each idea's pros and cons before picking one.

21. Handling Conflict

Two workers disagree on how to do a job. Instead of arguing, they talk calmly, listen to each other, and find a solution they both like.

22. Improving Processes

A cashier thinks of a faster way to ring up items so customers don't have to wait as long.

23. Asking Questions

Before starting a big task, an employee asks for clear instructions and checks if they have the necessary tools.

24. Checking Facts

Before presenting a report, someone double-checks all their information to make sure there are no mistakes.

25. Planning for the Future

A business owner thinks about what might happen in the next few years, like new competitors or changes in what customers want, and makes plans based on those thoughts.

26. Understanding Perspectives

A team is designing a new toy. They think about what kids and parents would both like instead of just what they think is fun.

School Examples

27. Researching a Topic

For a history project, a student looks up different sources to understand an event from multiple viewpoints.

28. Debating an Issue

In a class discussion, students pick sides on a topic, like school uniforms, and share reasons to support their views.

29. Evaluating Sources

While writing an essay, a student checks if the information from a website is trustworthy or might be biased.

30. Problem Solving in Math

When stuck on a tricky math problem, a student tries different methods to find the answer instead of giving up.

31. Analyzing Literature

In English class, students discuss why a character in a book made certain choices and what those decisions reveal about them.

32. Testing a Hypothesis

For a science experiment, students guess what will happen and then conduct tests to see if they're right or wrong.

33. Giving Peer Feedback

After reading a classmate's essay, a student offers suggestions for improving it.

34. Questioning Assumptions

In a geography lesson, students consider why certain countries are called "developed" and what that label means.

35. Designing a Study

For a psychology project, students plan an experiment to understand how people's memories work and think of ways to ensure accurate results.

36. Interpreting Data

In a science class, students look at charts and graphs from a study, then discuss what the information tells them and if there are any patterns.

Critical Thinking Puzzles


Not all scenarios will have a single correct answer that can be figured out by thinking critically. Sometimes we have to think critically about ethical choices or moral behaviors. 

Here are some mind games and scenarios you can solve using critical thinking. You can see the solution(s) at the end of the post.

37. The Farmer, Fox, Chicken, and Grain Problem

A farmer is at a riverbank with a fox, a chicken, and a grain bag. He needs to get all three items across the river. However, his boat can only carry himself and one of the three items at a time. 

Here's the challenge:

  • If the fox is left alone with the chicken, the fox will eat the chicken.
  • If the chicken is left alone with the grain, the chicken will eat the grain.

How can the farmer get all three items across the river without any item being eaten? 

38. The Rope, Jar, and Pebbles Problem

You are in a room with two long ropes hanging from the ceiling. Each rope is just out of arm's reach from the other, so you can't hold onto one rope and reach the other simultaneously. 

Your task is to tie the two rope ends together, but you can't move the position where they hang from the ceiling.

You are given a jar full of pebbles. How do you complete the task?

39. The Two Guards Problem

Imagine there are two doors. One door leads to certain doom, and the other leads to freedom. You don't know which is which.

In front of each door stands a guard. One guard always tells the truth. The other guard always lies. You don't know which guard is which.

You can ask only one question to one of the guards. What question should you ask to find the door that leads to freedom?

40. The Hourglass Problem

You have two hourglasses. One measures 7 minutes when turned over, and the other measures 4 minutes. Using just these hourglasses, how can you time exactly 9 minutes?

41. The Lifeboat Dilemma

Imagine you're on a ship that's sinking. You get on a lifeboat, but it's already too full and might flip over. 

Nearby in the water, five people are struggling: a scientist close to finding a cure for a sickness, an old couple who've been together for a long time, a mom with three kids waiting at home, and a tired teenager who helped save others but is now in danger. 

You can only save one person without making the boat flip. Who would you choose?

42. The Tech Dilemma

You work at a tech company and help make a computer program to help small businesses. You're almost ready to share it with everyone, but you find out there might be a small chance it has a problem that could show users' private info. 

If you decide to fix it, you must wait two more months before sharing it. But your bosses want you to share it now. What would you do?

43. The History Mystery

Dr. Amelia is a history expert. She's studying where a group of people traveled long ago. She reads old letters and documents to learn about it. But she finds some letters that tell a different story than what most people believe. 

If she says this new story is true, it could change what people learn in school and what they think about history. What should she do?

The Role of Bias in Critical Thinking

Have you ever decided you don’t like someone before you even know them? Or maybe someone shared an idea with you that you immediately loved without even knowing all the details. 

This experience is called bias, which occurs when you like or dislike something or someone without a good reason or knowing why. It can also take shape in certain reactions to situations, like a habit or instinct. 

Bias comes from our own experiences, what friends or family tell us, or even things we are born believing. Sometimes, bias can help us stay safe, but other times it stops us from seeing the truth.

Not all bias is bad. Bias can be a mechanism for assessing our potential safety in a new situation. If we are biased to think that anything long, thin, and coiled up is a snake, we might treat a rope as something to be afraid of before we realize it is just a rope.

While bias might serve us in some situations (like jumping out of the way of an actual snake before we have time to process that we need to be jumping out of the way), it often harms our ability to think critically.

How Bias Gets in the Way of Good Thinking

Selective Perception: We only notice things that match our ideas and ignore the rest. 

It's like only picking red candies from a mixed bowl because you think they taste the best, but they taste the same as every other candy in the bowl. It could also be when we see all the signs that our partner is cheating on us but choose to ignore them because we are happy the way we are (or at least, we think we are).

Agreeing with Yourself: This is called “confirmation bias”: we only listen to ideas that match our own and seek, interpret, and remember information in a way that confirms what we already think we know or believe.

An example is when someone wants to know if it is safe to vaccinate their children but already believes that vaccines are not safe, so they only look for information supporting the idea that vaccines are bad.

Thinking We Know It All: Similar to confirmation bias, this is called “overconfidence bias.” Sometimes we think our ideas are the best and don't listen to others. This can stop us from learning.

Have you ever met someone you’d consider a “know-it-all”? They probably have a lot of overconfidence bias: they may know many things accurately, but no one can know everything, and acting as if they do is overconfidence bias.

There's a weird kind of bias similar to this called the Dunning-Kruger effect: when someone is bad at what they do but believes and acts like they are the best.

Following the Crowd: This is formally called “groupthink”. It's hard to speak up with a different idea if everyone agrees. But this can lead to mistakes.

An example of this we’ve all likely seen is the cool clique in primary school. There is usually one person that is the head of the group, the “coolest kid in school”, and everyone listens to them and does what they want, even if they don’t think it’s a good idea.

How to Overcome Biases

Here are a few ways to learn to think better, free from our biases (or at least aware of them!).

Know Your Biases: Realize that everyone has biases. If we know about them, we can think better.

Listen to Different People: Talking to different kinds of people can give us new ideas.

Ask Why: Always ask yourself why you believe something. Is it true, or is it just a bias?

Understand Others: Try to think about how others feel. It helps you see things in new ways.

Keep Learning: Always be curious and open to new information.


In today's world, everything changes fast, and there's so much information everywhere. This makes critical thinking super important. It helps us distinguish between what's real and what's made up. It also helps us make good choices. But thinking this way can be tough sometimes because of biases. These are like sneaky thoughts that can trick us. The good news is we can learn to see them and think better.

There are cool tools and ways we've talked about, like the "Socratic Questioning" method and the "Six Thinking Hats." These tools help us get better at thinking. These thinking skills can also help us in school, work, and everyday life.

We’ve also looked at specific scenarios where critical thinking would be helpful, such as deciding what diet to follow and checking facts.

Thinking isn't just a skill—it's a special talent we improve over time. Working on it lets us see things more clearly and understand the world better. So, keep practicing and asking questions! It'll make you a smarter thinker and help you see the world differently.

Critical Thinking Puzzles (Solutions)

The Farmer, Fox, Chicken, and Grain Problem

  • The farmer first takes the chicken across the river and leaves it on the other side.
  • He returns to the original side and takes the fox across the river.
  • After leaving the fox on the other side, he returns the chicken to the starting side.
  • He leaves the chicken on the starting side and takes the grain bag across the river.
  • He leaves the grain with the fox on the other side and returns to get the chicken.
  • The farmer takes the chicken across, and now all three items -- the fox, the chicken, and the grain -- are safely on the other side of the river.
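The crossing sequence above can also be found mechanically. Here is a minimal Python sketch (the state encoding and function names are my own, not from the article) that runs a breadth-first search over bank configurations, discarding any state where something gets eaten:

```python
from collections import deque

# Each state records which bank everyone is on: 0 = start bank, 1 = far bank,
# in the order (farmer, fox, chicken, grain).
ITEMS = ("fox", "chicken", "grain")

def is_safe(state):
    farmer, fox, chicken, grain = state
    if fox == chicken != farmer:    # fox left alone with the chicken
        return False
    if chicken == grain != farmer:  # chicken left alone with the grain
        return False
    return True

def solve():
    start, goal = (0, 0, 0, 0), (1, 1, 1, 1)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        farmer = state[0]
        # The farmer rows alone (None) or ferries one item from his own bank.
        for cargo in (None, 0, 1, 2):
            if cargo is not None and state[cargo + 1] != farmer:
                continue  # that item is on the other bank
            nxt = list(state)
            nxt[0] = 1 - farmer
            if cargo is not None:
                nxt[cargo + 1] = 1 - farmer
            nxt = tuple(nxt)
            if nxt not in seen and is_safe(nxt):
                seen.add(nxt)
                queue.append((nxt, path + ["alone" if cargo is None else ITEMS[cargo]]))
    return None

print(solve())
```

Because breadth-first search explores shorter plans first, the first solution it finds uses the fewest crossings: seven trips, matching the steps listed above.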

The Rope, Jar, and Pebbles Problem

  • Take one rope and tie the jar of pebbles to its end.
  • Swing the rope with the jar in a pendulum motion.
  • While the rope is swinging, grab the other rope and wait.
  • As the swinging rope comes back within reach due to its pendulum motion, grab it.
  • With both ropes within reach, untie the jar and tie the rope ends together.

The Two Guards Problem

Ask either guard: “Which door would the other guard say leads to freedom?” Both guards will point to the door that leads to doom, so choose the opposite door.
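One classic phrasing of the question is “Which door would the other guard say leads to freedom?”, after which you take the opposite door. This logic can be checked exhaustively; the short Python sketch below (door labels and function name are illustrative) enumerates all four cases, since either door may be the freedom door and the guard you ask may be the liar or the truth-teller:

```python
# Brute-force check of the two-guards riddle. Whichever guard you ask,
# the answer to "Which door would the other guard say leads to freedom?"
# always names the doom door, so you walk through the other one.

def points_to(asked_lies, freedom, doom):
    other_lies = not asked_lies  # one guard lies, the other tells the truth
    # What the other guard would answer if asked directly for the freedom door:
    other_says = doom if other_lies else freedom
    # The asked guard relays that answer truthfully, or flips it if lying:
    if asked_lies:
        return doom if other_says == freedom else freedom
    return other_says

# Check all four cases: either door may be freedom, either guard may be asked.
for freedom, doom in (("left", "right"), ("right", "left")):
    for asked_lies in (False, True):
        assert points_to(asked_lies, freedom, doom) == doom

print("In every case the named door is the doom door, so choose the other one.")
```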

The Hourglass Problem

  • Start both hourglasses at the same time.
  • When the 4-minute hourglass runs out (at 4 minutes), turn it over.
  • When the 7-minute hourglass runs out (at 7 minutes), turn it over. The 4-minute hourglass has now been running for 3 minutes since its flip, so it has 1 minute left.
  • When the 4-minute hourglass runs out for the second time (at 8 minutes), the 7-minute hourglass has been running for 1 minute. Turn it over; the 1 minute of sand that has fallen runs back, and when it empties, exactly 9 minutes have passed.
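The schedule can be double-checked with a short simulation (the function name is illustrative) that tracks the sand remaining in each glass's top bulb through every flip:

```python
# Replay of the hourglass flip schedule. Returns the time at which the
# 7-minute glass finally empties, which should be exactly 9 minutes.

def nine_minutes():
    t = 0.0
    seven_top, four_top = 7.0, 4.0  # minutes of sand in each top bulb

    def run(minutes):
        nonlocal t, seven_top, four_top
        t += minutes
        seven_top -= minutes
        four_top -= minutes

    run(4.0)                      # t=4: 4-minute glass empties
    four_top = 4.0                # flip the 4-minute glass
    run(3.0)                      # t=7: 7-minute glass empties
    seven_top = 7.0               # flip the 7-minute glass
    run(1.0)                      # t=8: 4-minute glass empties again
    seven_top = 7.0 - seven_top   # flip: only the fallen 1 minute runs back
    run(seven_top)                # t=9: 7-minute glass empties
    return t

print(nine_minutes())  # 9.0
```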

The Boat and Weights Problem

Take the cat over first and leave it on the other side. Then, return and take the fish across next. When you get there, take the cat back with you. Leave the cat on the starting side and take the cat food across. Lastly, return to get the cat and bring it to the other side.

The Lifeboat Dilemma

There isn’t one correct answer to this problem. Here are some elements to consider:

  • Moral Principles: What values guide your decision? Is it the potential greater good for humanity (the scientist)? The value of long-standing love and commitment (the elderly couple)? The future of young children who depend on their mother? Or the selfless bravery of the teenager?
  • Future Implications: Consider the future consequences of each choice. Saving the scientist might benefit millions in the future, but what moral message does it send about the value of individual lives?
  • Emotional vs. Logical Thinking: While it's essential to engage empathy, it's also crucial not to let emotions cloud judgment entirely. For instance, while the teenager's bravery is commendable, does it make him more deserving of a spot on the boat than the others?
  • Acknowledging Uncertainty: The scientist claims to be close to a significant breakthrough, but there's no certainty. How does this uncertainty factor into your decision?
  • Personal Bias: Recognize and challenge any personal biases, such as biases towards age, profession, or familial status.

The Tech Dilemma

Again, there isn’t one correct answer to this problem. Here are some elements to consider:

  • Evaluate the Risk: How severe is the potential vulnerability? Can it be easily exploited, or would it require significant expertise? Even if the circumstances are rare, what would be the consequences if the vulnerability were exploited?
  • Stakeholder Considerations: Different stakeholders will have different priorities. Upper management might prioritize financial projections, the marketing team might be concerned about the product's reputation, and customers might prioritize the security of their data. How do you balance these competing interests?
  • Short-Term vs. Long-Term Implications: While launching on time could meet immediate financial goals, consider the potential long-term damage to the company's reputation if the vulnerability is exploited. Would the short-term gains be worth the potential long-term costs?
  • Ethical Implications : Beyond the financial and reputational aspects, there's an ethical dimension to consider. Is it right to release a product with a known vulnerability, even if the chances of it being exploited are low?
  • Seek External Input: Consulting with cybersecurity experts outside your company might be beneficial. They could provide a more objective risk assessment and potential mitigation strategies.
  • Communication: How will you communicate the decision, whatever it may be, both internally to your team and upper management and externally to your customers and potential users?
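One way to structure the risk-versus-deadline reasoning above is a simple expected-cost comparison. The figures below are hypothetical placeholders, not data from any real launch:

```python
# Compare the expected cost of launching with the known vulnerability
# against the known cost of delaying. All numbers are invented.
delay_cost = 2_000_000          # missed revenue and contract penalties
breach_probability = 0.05       # chance the rare circumstances occur
breach_cost = 80_000_000        # incident response, lawsuits, lost trust

expected_launch_cost = breach_probability * breach_cost

if expected_launch_cost > delay_cost:
    decision = "delay and patch"
else:
    decision = "launch on schedule"

print(f"Expected cost of launching now: ${expected_launch_cost:,.0f}")
print(f"Decision under these assumptions: {decision}")
```

Even a rough calculation like this makes the hidden assumptions (probability and impact estimates) explicit, which is exactly where critical scrutiny should focus.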

The History Mystery

Dr. Amelia should take the following steps:

  • Verify the Letters: Before making any claims, she should check that the letters are authentic and not fake. She can do this by determining when and where they were written and whether they match other materials from that time.
  • Get a Second Opinion: It's always good to have someone else look at what you've found. Dr. Amelia could show the letters to other history experts and see their thoughts.
  • Research More: Maybe there are more documents or letters out there that support this new story. Dr. Amelia should keep looking to see if she can find more evidence.
  • Share the Findings: If Dr. Amelia believes the letters are true after all her checks, she should tell others. This can be through books, talks, or articles.
  • Stay Open to Feedback: Some people might agree with Dr. Amelia, and others might not. She should listen to everyone and be ready to learn more or change her mind if new information arises.

Ultimately, Dr. Amelia's job is to find out the truth about history and share it. It's okay if this new truth differs from what people used to believe. History is about learning from the past, no matter the story.



Michael W. Austin Ph.D.

Standards of Critical Thinking

Thinking towards truth.

Posted June 11, 2012 | Reviewed by Ekua Hagan


What is critical thinking? According to my favorite critical thinking text , it is disciplined thinking that is governed by clear intellectual standards.

This involves identifying and analyzing arguments and truth claims, discovering and overcoming prejudices and biases, developing your own reasons and arguments in favor of what you believe, considering objections to your beliefs, and making rational choices about what to do based on your beliefs.

Clarity is an important standard of critical thought. Clarity of communication is one aspect of this. We must be clear in how we communicate our thoughts, beliefs, and reasons for those beliefs.

Careful attention to language is essential here. For example, when we talk about morality , one person may have in mind the conventional morality of a particular community, while another may be thinking of certain transcultural standards of morality. Defining our terms can greatly aid us in the quest for clarity.

Clarity of thought is important as well; this means that we clearly understand what we believe, and why we believe it.

Precision involves working hard at getting the issue under consideration before our minds in a particular way. One way to do this is to ask the following questions: What is the problem at issue? What are the possible answers? What are the strengths and weaknesses of each answer?

Accuracy is unquestionably essential to critical thinking. In order to get at or closer to the truth, critical thinkers seek accurate and adequate information. They want the facts because they need the right information before they can move forward and analyze it.

Relevance means that the information and ideas discussed must be logically relevant to the issue being discussed. Many pundits and politicians are great at distracting us away from this.

Consistency is a key aspect of critical thinking. Our beliefs should be consistent. We shouldn’t hold beliefs that are contradictory. If we find that we do hold contradictory beliefs, then at least one of those beliefs must be false. For example, I would likely contradict myself if I believed both that "Racism is always immoral" and "Morality is entirely relative." This is a logical inconsistency.

There is another form of inconsistency, called practical inconsistency, which involves saying you believe one thing while doing another. For example, if I say that I believe my family is more important than my work, but I tend to sacrifice their interests for the sake of my work, then I am being practically inconsistent.

The last three standards are logical correctness, completeness, and fairness. Logical correctness means that one is engaging in correct reasoning from what we believe in a given instance to the conclusions that follow from those beliefs. Completeness means that we engage in deep and thorough thinking and evaluation, avoiding shallow and superficial thought and criticism. Fairness involves seeking to be open-minded, impartial, and free of biases and preconceptions that distort our thinking.

Like any skill or set of skills, getting better at critical thinking requires practice. Anyone wanting to grow in this area might think through these standards and apply them to an editorial in the newspaper or on the web, a blog post, or even their own beliefs. Doing so can be a useful and often meaningful exercise.

Michael W. Austin, Ph.D., is a professor of philosophy at Eastern Kentucky University.


Think Smarter: Critical Thinking to Improve Problem-Solving and Decision-Making Skills by Michael Kallet


23 Consistency

The consistency of your premise components.

Another tool that ensures a strong premise is consistency, which is the way that premise elements support each other. For example, let's say you read glowing reviews for a restaurant on several review websites—dozens of them, and all were great. A few of your friends ate at the restaurant and loved it too, and the American Automobile Association (AAA) rated it highly. A newspaper article on the place rated it top-notch. The restaurant has been in business for 23 years. These observations are very consistent and would make you confident the restaurant is good. Then you read one review from a person who hated the place and said the service was slow, the food was cold, and the server was nasty. This review is inconsistent with everything else—but because there are so many positive reviews against the one negative review, you probably would discount it.

Premises are stronger when their components are consistent. Inconsistency isn't necessarily a bad thing, though: it identifies conflicting information, which for the moment yields a suspicious, weak premise. When you understand why there's conflicting information, or if you can resolve the inconsistency, your premise becomes stronger. Let's say you're shopping online at Amazon.com. You're looking for an item you usually buy for around $20. Most of the items you observe online are also in the $20 range, but you see one for $14. Because that price is inconsistent with your experience and ...
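The restaurant-review reasoning can be sketched numerically: with many mutually consistent ratings, a single conflicting rating stands out and gets discounted. The ratings below are made up for illustration:

```python
from statistics import mean, stdev

# Hypothetical ratings (1-5): many consistent positives and one outlier.
ratings = [5, 5, 4, 5, 4, 5, 5, 4, 5, 1]

mu, sigma = mean(ratings), stdev(ratings)
# Flag any rating more than two standard deviations from the mean as
# inconsistent with the rest of the evidence.
outliers = [r for r in ratings if abs(r - mu) > 2 * sigma]
consistent = [r for r in ratings if abs(r - mu) <= 2 * sigma]

print(f"mean={mu:.2f}, outliers={outliers}")
print(f"mean without outliers={mean(consistent):.2f}")
```

A critical thinker would still ask *why* the outlier exists before discarding it; the arithmetic only flags the inconsistency, it doesn't explain it.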


Critical Thinking Definition, Skills, and Examples

Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings.

Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful details to solve problems or make decisions. Employers prioritize the ability to think critically—find out why, plus see how you can demonstrate that you have this ability throughout the job application process. 

Why Do Employers Value Critical Thinking Skills?

Employers want job candidates who can evaluate a situation using logical thought and offer the best solution. Someone with critical thinking skills can be trusted to make decisions independently and will not need constant handholding or micromanaging. Critical thinking abilities are among the most sought-after skills in almost every industry and workplace. You can demonstrate critical thinking by using related keywords in your resume and cover letter, and during your interview.

Examples of Critical Thinking

The circumstances that demand critical thinking vary from industry to industry. Some examples include:

  • A triage nurse analyzes the cases at hand and decides the order by which the patients should be treated.
  • A plumber evaluates the materials that would best suit a particular job.
  • An attorney reviews evidence and devises a strategy to win a case or to decide whether to settle out of court.
  • A manager analyzes customer feedback forms and uses this information to develop a customer service training session for employees.

Promote Your Skills in Your Job Search

If critical thinking is a key phrase in the job listings you are applying for, be sure to emphasize your critical thinking skills throughout your job search.

Add Keywords to Your Resume

You can use critical thinking keywords (analytical, problem solving, creativity, etc.) in your resume. When describing your  work history , include top critical thinking skills that accurately describe you. You can also include them in your  resume summary , if you have one.

For example, your summary might read, “Marketing Associate with five years of experience in project management. Skilled in conducting thorough market research and competitor analysis to assess market trends and client needs, and to develop appropriate acquisition tactics.”

Mention Skills in Your Cover Letter

Include these critical thinking skills in your cover letter. In the body of your letter, mention one or two of these skills, and give specific examples of times when you have demonstrated them at work. Think about times when you had to analyze or evaluate materials to solve a problem.

Show the Interviewer Your Skills

You can use these skill words in an interview. Discuss a time when you were faced with a particular problem or challenge at work and explain how you applied critical thinking to solve it.

Some interviewers will give you a hypothetical scenario or problem, and ask you to use critical thinking skills to solve it. In this case, explain your thought process thoroughly to the interviewer. He or she is typically more focused on how you arrive at your solution rather than the solution itself. The interviewer wants to see you analyze and evaluate (key parts of critical thinking) the given scenario or problem.

Of course, each job will require different skills and experiences, so make sure you read the job description carefully and focus on the skills listed by the employer.

Top Critical Thinking Skills

Keep these in-demand critical thinking skills in mind as you update your resume and write your cover letter. As you've seen, you can also emphasize them at other points throughout the application process, such as your interview. 

Analytical Skills

Part of critical thinking is the ability to carefully examine something, whether it is a problem, a set of data, or a text. People with analytical skills can examine information, understand what it means, and properly explain to others the implications of that information.

  • Asking Thoughtful Questions
  • Data Analysis
  • Interpretation
  • Questioning Evidence
  • Recognizing Patterns

Communication

Often, you will need to share your conclusions with your employers or with a group of colleagues. You need to be able to  communicate with others  to share your ideas effectively. You might also need to engage in critical thinking in a group. In this case, you will need to work with others and communicate effectively to figure out solutions to complex problems.

  • Active Listening
  • Collaboration
  • Explanation
  • Interpersonal
  • Presentation
  • Verbal Communication
  • Written Communication

Creativity

Critical thinking often involves creativity and innovation. You might need to spot patterns in the information you are looking at or come up with a solution that no one else has thought of before. All of this involves a creative eye that can take a different approach from all other approaches.

  • Flexibility
  • Conceptualization
  • Imagination
  • Drawing Connections
  • Synthesizing

Open-Mindedness

To think critically, you need to be able to put aside any assumptions or judgments and merely analyze the information you receive. You need to be objective, evaluating ideas without bias.

  • Objectivity
  • Observation

Problem Solving

Problem-solving is another critical thinking skill that involves analyzing a problem, generating and implementing a solution, and assessing the success of the plan. Employers don’t simply want employees who can think about information critically; they also need employees who can come up with practical solutions.

  • Attention to Detail
  • Clarification
  • Decision Making
  • Groundedness
  • Identifying Patterns

More Critical Thinking Skills

  • Inductive Reasoning
  • Deductive Reasoning
  • Noticing Outliers
  • Adaptability
  • Emotional Intelligence
  • Brainstorming
  • Optimization
  • Restructuring
  • Integration
  • Strategic Planning
  • Project Management
  • Ongoing Improvement
  • Causal Relationships
  • Case Analysis
  • Diagnostics
  • SWOT Analysis
  • Business Intelligence
  • Quantitative Data Management
  • Qualitative Data Management
  • Risk Management
  • Scientific Method
  • Consumer Behavior

Key Takeaways

  • Demonstrate that you have critical thinking skills by adding relevant keywords to your resume.
  • Mention pertinent critical thinking skills in your cover letter, too, and include an example of a time when you demonstrated them at work.
  • Finally, highlight critical thinking skills during your interview. For instance, you might discuss a time when you were faced with a challenge at work and explain how you applied critical thinking skills to solve it.



SkillsYouNeed


What is Critical Thinking?

Critical thinking is the ability to think clearly and rationally, understanding the logical connection between ideas.  Critical thinking has been the subject of much debate and thought since the time of early Greek philosophers such as Plato and Socrates and has continued to be a subject of discussion into the modern age, for example the ability to recognise fake news .

Critical thinking might be described as the ability to engage in reflective and independent thinking.

In essence, critical thinking requires you to use your ability to reason. It is about being an active learner rather than a passive recipient of information.

Critical thinkers rigorously question ideas and assumptions rather than accepting them at face value. They will always seek to determine whether the ideas, arguments and findings represent the entire picture and are open to finding that they do not.

Critical thinkers will identify, analyse and solve problems systematically rather than by intuition or instinct.

Someone with critical thinking skills can:

Understand the links between ideas.

Determine the importance and relevance of arguments and ideas.

Recognise, build and appraise arguments.

Identify inconsistencies and errors in reasoning.

Approach problems in a consistent and systematic way.

Reflect on the justification of their own assumptions, beliefs and values.

Critical thinking is thinking about things in certain ways so as to arrive at the best possible solution in the circumstances that the thinker is aware of. In more everyday language, it is a way of thinking about whatever is presently occupying your mind so that you come to the best possible conclusion.

Critical Thinking is:

A way of thinking about particular things at a particular time; it is not the accumulation of facts and knowledge or something that you can learn once and then use in that form forever, such as the nine times table you learn and use in school.

The Skills We Need for Critical Thinking

The skills that we need in order to be able to think critically are varied and include observation, analysis, interpretation, reflection, evaluation, inference, explanation, problem solving, and decision making.

Specifically we need to be able to:

Think about a topic or issue in an objective and critical way.

Identify the different arguments there are in relation to a particular issue.

Evaluate a point of view to determine how strong or valid it is.

Recognise any weaknesses or negative points that there are in the evidence or argument.

Notice what implications there might be behind a statement or argument.

Provide structured reasoning and support for an argument that we wish to make.

The Critical Thinking Process

You should be aware that none of us think critically all the time.

Sometimes we think in almost any way but critically, for example when our self-control is affected by anger, grief or joy or when we are feeling just plain ‘bloody minded’.

On the other hand, the good news is that, since our critical thinking ability varies according to our current mindset, most of the time we can learn to improve our critical thinking ability by developing certain routine activities and applying them to all problems that present themselves.

Once you understand the theory of critical thinking, improving your critical thinking skills takes persistence and practice.

Try this simple exercise to help you to start thinking critically.

Think of something that someone has recently told you. Then ask yourself the following questions:

Who said it?

Someone you know? Someone in a position of authority or power? Does it matter who told you this?

What did they say?

Did they give facts or opinions? Did they provide all the facts? Did they leave anything out?

Where did they say it?

Was it in public or in private? Did other people have a chance to respond and provide an alternative account?

When did they say it?

Was it before, during or after an important event? Is timing important?

Why did they say it?

Did they explain the reasoning behind their opinion? Were they trying to make someone look good or bad?

How did they say it?

Were they happy or sad, angry or indifferent? Did they write it or say it? Could you understand what was said?

What are you Aiming to Achieve?

One of the most important aspects of critical thinking is to decide what you are aiming to achieve and then make a decision based on a range of possibilities.

Once you have clarified that aim for yourself you should use it as the starting point in all future situations requiring thought and, possibly, further decision making. Where needed, make your workmates, family or those around you aware of your intention to pursue this goal. You must then discipline yourself to keep on track until changing circumstances mean you have to revisit the start of the decision making process.

However, there are things that get in the way of simple decision making. We all carry with us a range of likes and dislikes, learnt behaviours and personal preferences developed throughout our lives; they are the hallmarks of being human. A major contribution to ensuring we think critically is to be aware of these personal characteristics, preferences and biases and make allowance for them when considering possible next steps, whether they are at the pre-action consideration stage or as part of a rethink caused by unexpected or unforeseen impediments to continued progress.

The more clearly we are aware of ourselves, our strengths and weaknesses, the more likely our critical thinking will be productive.

The Benefit of Foresight

Perhaps the most important element of thinking critically is foresight.

Few of the decisions we make and implement will prove disastrous if we are prepared to abandon them when we find reasons to do so. However, our decision making will be infinitely better and more likely to lead to success if, when we reach a tentative conclusion, we pause and consider the impact on the people and activities around us.

The elements needing consideration are generally numerous and varied. In many cases, consideration of one element from a different perspective will reveal potential dangers in pursuing our decision.

For instance, moving a business activity to a new location may improve potential output considerably but it may also lead to the loss of skilled workers if the distance moved is too great. Which of these is the more important consideration? Is there some way of lessening the conflict?

These are the sort of problems that may arise from incomplete critical thinking, a demonstration perhaps of the critical importance of good critical thinking.


In Summary:

Critical thinking is aimed at achieving the best possible outcomes in any situation. In order to achieve this it must involve gathering and evaluating information from as many different sources as possible.

Critical thinking requires a clear, often uncomfortable, assessment of your personal strengths, weaknesses and preferences and their possible impact on decisions you may make.

Critical thinking requires the development and use of foresight as far as this is possible. As Doris Day sang, “the future’s not ours to see”.

Implementing the decisions made arising from critical thinking must take into account an assessment of possible outcomes and ways of avoiding potentially negative outcomes, or at least lessening their impact.

Critical thinking involves reviewing the results of the application of decisions made and implementing change where possible.

It might be thought that we are overextending our demands on critical thinking in expecting that it can help to construct focused meaning rather than examining the information given and the knowledge we have acquired to see if we can, if necessary, construct a meaning that will be acceptable and useful.

After all, almost no information we have available to us, either externally or internally, carries any guarantee of its life or appropriateness. Neat step-by-step instructions may provide some sort of trellis on which our basic understanding of critical thinking can blossom, but they don't and cannot provide any assurance of certainty, utility or longevity.


BRIEF RESEARCH REPORT article

How Do Critical Thinking Ability and Critical Thinking Disposition Relate to the Mental Health of University Students?

Zhiyuan Liu

  • School of Education, Huazhong University of Science and Technology, Wuhan, China

Theories of psychotherapy suggest that human mental problems are associated with deficiencies in critical thinking. However, it currently remains unclear whether both critical thinking skill and critical thinking disposition relate to individual differences in mental health. This study explored whether and how the critical thinking ability and critical thinking disposition of university students are associated with individual differences in mental health, taking into account impulsivity, which has been shown to be closely related to both critical thinking and mental health. Regression and structural equation modeling analyses based on a Chinese university student sample ( N = 314, 198 females, M age = 18.65) revealed that critical thinking skill and disposition explained a unique variance of mental health after controlling for impulsivity. Furthermore, the relationship between critical thinking and mental health was mediated by motor impulsivity (acting on the spur of the moment) and non-planning impulsivity (making decisions without careful forethought). These findings provide a preliminary account of how human critical thinking is associated with mental health. Practically, it is suggested that mental health promotion programs for university students pay special attention to cultivating students' critical thinking dispositions and enhancing their control over impulsive behavior.
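The mediation pattern the abstract describes (critical thinking relating to mental health partly through impulsivity) can be illustrated on synthetic data. The coefficients below are invented for the sketch and are not the study's estimates; the point is only that the direct effect shrinks once the mediator is controlled for:

```python
import random
from statistics import mean

# Synthetic data: critical thinking (ct) lowers impulsivity (imp),
# and impulsivity lowers mental health (mh). All effects are made up.
random.seed(0)
n = 1000
ct = [random.gauss(0, 1) for _ in range(n)]
imp = [-0.6 * c + random.gauss(0, 1) for c in ct]
mh = [-0.5 * i + 0.1 * c + random.gauss(0, 1) for i, c in zip(imp, ct)]

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

# Simple regression: total effect of critical thinking on mental health.
total = cov(ct, mh) / cov(ct, ct)

# Two-predictor regression via the normal equations on centered data:
# the direct effect of ct on mh, controlling for impulsivity.
s_cc, s_ii = cov(ct, ct), cov(imp, imp)
s_ci, s_cy, s_iy = cov(ct, imp), cov(ct, mh), cov(imp, mh)
det = s_cc * s_ii - s_ci ** 2
direct = (s_ii * s_cy - s_ci * s_iy) / det

print(f"total effect ~ {total:.2f}, direct effect ~ {direct:.2f}")
```

With these invented coefficients the total effect is near 0.4 while the direct effect drops toward 0.1, the signature of partial mediation.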

Introduction

Although there is no consistent definition of critical thinking (CT), it is usually described as “purposeful, self-regulatory judgment that results in interpretation, analysis, evaluation, and inference, as well as explanations of the evidential, conceptual, methodological, criteriological, or contextual considerations that judgment is based upon” ( Facione, 1990 , p. 2). This suggests that CT is a combination of skills and dispositions. The skill aspect mainly refers to higher-order cognitive skills such as inference, analysis, and evaluation, while the disposition aspect represents one's consistent motivation and willingness to use CT skills ( Dwyer, 2017 ). An increasing number of studies have indicated that CT plays crucial roles in the activities of university students such as their academic performance (e.g., Ghanizadeh, 2017 ; Ren et al., 2020 ), professional work (e.g., Barry et al., 2020 ), and even the ability to cope with life events (e.g., Butler et al., 2017 ). An area that has received less attention is how critical thinking relates to impulsivity and mental health. This study aimed to clarify the relationship between CT (which included both CT skill and CT disposition), impulsivity, and mental health among university students.

Relationship Between Critical Thinking and Mental Health

Associating critical thinking with mental health is not without reason, since theories of psychotherapy have long stressed a linkage between mental problems and dysfunctional thinking ( Gilbert, 2003 ; Gambrill, 2005 ; Cuijpers, 2019 ). Proponents of cognitive behavioral therapy suggest that how people interpret a situation affects their emotional, behavioral, and physiological reactions. Those with mental problems are inclined toward biased or heuristic thinking and are more likely to misinterpret neutral or even positive situations ( Hollon and Beck, 2013 ). Therefore, a main goal of cognitive behavioral therapy is to overcome biased thinking and change maladaptive beliefs via cognitive modification skills such as objectively understanding one's cognitive distortions, analyzing evidence for and against one's automatic thinking, or testing the effect of an alternative way of thinking. Achieving these therapeutic goals requires critical thinking, such as the willingness and ability to critically analyze one's thoughts and evaluate evidence and arguments independently of one's prior beliefs. In addition to these theoretical underpinnings, the characteristics of university students also suggest a relationship between CT and mental health. University students are an at-risk population in terms of mental health. They face many normative transitions (e.g., social and romantic relationships, important exams, financial pressures), which are stressful ( Duffy et al., 2019 ). In particular, the risk increases when students experience academic failure ( Lee et al., 2008 ; Mamun et al., 2021 ). Hong et al. (2010) found that stress in Chinese college students was primarily related to academic, personal, and negative life events. However, university students are also a population with many resources to draw on. Critical thinking can be considered one of the important resources available to students ( Stupple et al., 2017 ).
Both CT skills and CT disposition are valuable qualities for college students to possess ( Facione, 1990 ). There is evidence that students with a higher level of CT are more successful in terms of academic performance ( Ghanizadeh, 2017 ; Ren et al., 2020 ) and better at coping with stressful events ( Butler et al., 2017 ). This suggests that students with higher CT are less likely to suffer from mental problems.

Empirical research has reported an association between CT and mental health among college students ( Suliman and Halabi, 2007 ; Kargar et al., 2013 ; Yoshinori and Marcus, 2013 ; Chen and Hwang, 2020 ; Ugwuozor et al., 2021 ). Most of these studies focused on the relationship between CT disposition and mental health. For example, Suliman and Halabi (2007) reported that the CT disposition of nursing students was positively correlated with their self-esteem but negatively correlated with their state anxiety. Another study demonstrated that CT disposition influenced the intensity of worry in college students either by increasing their responsibility to continue thinking or by enhancing detached awareness of negative thoughts ( Yoshinori and Marcus, 2013 ). Regarding the relationship between CT ability and mental health, although there is no direct evidence, educational programs have examined the effect of teaching CT skills on the mental health of adolescents ( Kargar et al., 2013 ). The results showed that teaching CT skills decreased somatic symptoms, anxiety, depression, and insomnia in adolescents. Another recent CT skill intervention also found a significant reduction in mental stress among university students, suggesting an association between CT skills and mental health ( Ugwuozor et al., 2021 ).

The above research provides preliminary evidence in favor of the relationship between CT and mental health, in line with theories of CT and psychotherapy. However, previous studies have focused solely on the disposition aspect of CT, and its link with mental health. The ability aspect of CT has been largely overlooked in examining its relationship with mental health. Moreover, although the link between CT and mental health has been reported, it remains unknown how CT (including skill and disposition) is associated with mental health.

Impulsivity as a Potential Mediator Between Critical Thinking and Mental Health

One important factor suggested by previous research in accounting for the relationship between CT and mental health is impulsivity. Impulsivity is recognized as a pattern of action without regard to consequences. Patton et al. (1995) proposed that impulsivity is a multi-faceted construct that consists of three behavioral factors, namely, non-planning impulsiveness, referring to making a decision without careful forethought; motor impulsiveness, referring to acting on the spur of the moment; and attentional impulsiveness, referring to one's inability to focus on the task at hand. Impulsivity is prominent in clinical problems associated with psychiatric disorders ( Fortgang et al., 2016 ). A number of mental problems are associated with increased impulsivity that is likely to aggravate clinical illnesses ( Leclair et al., 2020 ). Moreover, a lack of CT is correlated with poor impulse control ( Franco et al., 2017 ). Applications of CT may reduce impulsive behaviors caused by heuristic and biased thinking when one makes a decision ( West et al., 2008 ). For example, Gregory (1991) suggested that CT skills enhance the ability of children to anticipate the health or safety consequences of a decision. Given this, those with high levels of CT are expected to take a rigorous attitude about the consequences of actions and are less likely to engage in impulsive behaviors, which may place them at a low risk of suffering mental problems. To the knowledge of the authors, no study has empirically tested whether impulsivity accounts for the relationship between CT and mental health.

This study examined whether CT skill and disposition are related to the mental health of university students and, if so, how the relationship works. First, we examined the simultaneous effects of CT ability and CT disposition on mental health. Second, we tested whether impulsivity mediated the effects of CT on mental health. To achieve these goals, we collected data on CT ability, CT disposition, mental health, and impulsivity from a sample of university students. The results are expected to shed light on the mechanism of the association between CT and mental health.

Participants and Procedure

A total of 314 university students (116 men) with an average age of 18.65 years ( SD = 0.67) participated in this study. They were recruited via advertisements at a local university in central China and were majoring in statistics and mathematical finance. The study protocol was approved by the Human Subjects Review Committee of the Huazhong University of Science and Technology. Each participant signed a written informed consent form describing the study purpose, procedure, and right of free withdrawal. All the measures were administered in a computer room. The participants were tested in groups of 20–30 by two research assistants. The researchers and research assistants had no formal connections with the participants. The testing included two sessions separated by a 10-min interval, so that the participants had an opportunity to take a break. In the first session, the participants completed the syllogistic reasoning problems with belief bias (SRPBB), the Chinese version of the California Critical Thinking Skills Test (CCTST-CV), and the Chinese Critical Thinking Disposition Inventory (CCTDI), respectively. In the second session, they completed the Barratt Impulsiveness Scale (BIS-11), the Depression Anxiety Stress Scale-21 (DASS-21), and the University Personality Inventory (UPI), in the given order.

Measures of Critical Thinking Ability

The Chinese version of the California Critical Thinking Skills Test was employed to measure CT skills ( Lin, 2018 ). The CCTST is currently the most cited tool for measuring CT skills and includes analysis, assessment, deduction, inductive reasoning, and inference reasoning. The Chinese version included 34 multiple choice items. The dependent variable was the number of correctly answered items. The internal consistency (Cronbach's α) of the CCTST is 0.56 ( Jacobs, 1995 ). The test–retest reliability of CCTST-CV is 0.63 ( p < 0.01) ( Luo and Yang, 2002 ), and correlations between scores of the subscales and the total score are larger than 0.5 ( Lin, 2018 ), supporting the construct validity of the scale. In this study among the university students, the internal consistency (Cronbach's α) of the CCTST-CV was 0.5.
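
Internal-consistency coefficients like those reported here can be reproduced from item-level data. Below is a minimal sketch of Cronbach's α in Python using NumPy; the score matrix is toy data, not the study's:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 3 items (hypothetical, for illustration only)
demo = np.array([
    [3, 4, 3],
    [5, 5, 4],
    [2, 2, 3],
    [4, 5, 5],
    [1, 2, 1],
])
alpha = cronbach_alpha(demo)
```

Higher α indicates that items covary strongly relative to the total-score variance, which is why a low α (such as 0.50 for the CCTST-CV here) signals weak internal consistency.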

The second critical thinking test employed in this study was adapted from the belief bias paradigm ( Li et al., 2021 ). This task paradigm measures the ability to evaluate evidence and arguments independently of one's prior beliefs ( West et al., 2008 ), a skill strongly emphasized in the CT literature. The current test included 20 syllogistic reasoning problems in which the logical conclusion was inconsistent with one's prior knowledge (e.g., "Premise 1: All fruits are sweet. Premise 2: Bananas are not sweet. Conclusion: Bananas are not fruits," a valid conclusion). In addition, four non-conflict items were included as a neutral condition in order to avoid habitual responses from the participants. The participants were instructed to suppose that all the premises were true and to decide whether the conclusion logically followed from the given premises. The measure showed good internal consistency (Cronbach's α = 0.83) in a Chinese sample ( Li et al., 2021 ). In this study, the internal consistency (Cronbach's α) of the SRPBB was 0.94.
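
The scoring logic of such a belief-bias task can be sketched as follows: a response counts as correct when it matches the logical validity of the syllogism, regardless of whether the conclusion is believable. The item structure and responses below are hypothetical illustrations, not the study's materials:

```python
# Each item records whether its conclusion is logically valid and whether it
# is believable; "conflict" items are those where the two disagree.
items = [
    {"valid": True,  "believable": False},  # e.g., "Bananas are not fruits"
    {"valid": False, "believable": True},   # believable but invalid conclusion
    {"valid": True,  "believable": True},   # non-conflict filler item
]
responses = [True, False, True]  # participant's "is the conclusion valid?" judgments

# Score = number of validity judgments that match logical validity
score = sum(r == item["valid"] for r, item in zip(responses, items))

# Conflict items are the ones that specifically index belief-bias resistance
conflict_items = [it for it in items if it["valid"] != it["believable"]]
```

A participant who answers by believability alone would miss every conflict item, which is what makes this paradigm a measure of reasoning independent of prior belief.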

Measures of Critical Thinking Disposition

The Chinese Critical Thinking Disposition Inventory was employed to measure CT disposition ( Peng et al., 2004 ). This scale was developed in line with the conceptual framework of the California Critical Thinking Disposition Inventory. We measured five CT dispositions: truth-seeking (one's objectivity with findings even if this requires changing one's preconceived opinions, e.g., a truth-seeking person might disagree with "I believe what I want to believe"), inquisitiveness (one's intellectual curiosity, e.g., "No matter what the topic, I am eager to know more about it"), analyticity (the tendency to use reasoning and evidence to solve problems, e.g., "It bothers me when people rely on weak arguments to defend good ideas"), systematicity (the disposition of being organized and orderly in inquiry, e.g., "I always focus on the question before I attempt to answer it"), and CT self-confidence (the trust one places in one's own reasoning processes, e.g., "I appreciate my ability to think precisely"). Each disposition contained 10 items, which the participants rated on a 6-point Likert-type scale. This measure has shown high internal consistency (overall Cronbach's α = 0.90) ( Peng et al., 2004 ). In this study, Cronbach's α for the CCTDI was 0.89, indicating good reliability.

Measure of Impulsivity

The Barratt Impulsiveness Scale ( Patton et al., 1995 ) was employed to assess three facets of impulsivity: non-planning impulsivity (e.g., "I plan tasks carefully"), motor impulsivity (e.g., "I act on the spur of the moment"), and attentional impulsivity (e.g., "I concentrate easily"). The scale includes 30 statements, each rated on a 5-point scale. The non-planning impulsivity and attentional impulsivity subscales were reverse-scored. The BIS-11 has good internal consistency (Cronbach's α = 0.81; Velotti et al., 2016 ). In this study, the Cronbach's α of the BIS-11 was 0.83.
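
Reverse-keyed items on a 5-point scale are conventionally recoded as 6 − raw score, so that higher totals consistently indicate higher impulsivity. A minimal sketch with toy responses (not the study's data):

```python
import numpy as np

SCALE_MAX = 5  # BIS-11 items are rated on a 1-5 scale

def reverse_score(raw: np.ndarray) -> np.ndarray:
    """Reverse a 1..5 Likert response: 1<->5, 2<->4, 3 stays 3."""
    return (SCALE_MAX + 1) - raw

# Toy responses to three reverse-keyed items (e.g., "I plan tasks carefully")
raw = np.array([1, 3, 5])
rev = reverse_score(raw)
```

After recoding, subscale totals can simply be summed, since every item then points in the same direction.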

Measures of Mental Health

The Depression Anxiety Stress Scale-21 was used to assess mental health problems such as depression (e.g., “I feel that life is meaningless”), anxiety (e.g., “I find myself getting agitated”), and stress (e.g., “I find it difficult to relax”). Each dimension included seven items, which the participants were asked to rate on a 4-point scale. The Chinese version of the DASS-21 has displayed a satisfactory factor structure and internal consistency (Cronbach's α = 0.92, Wang et al., 2016 ). In this study, the internal consistency (Cronbach's α) of the DASS-21 was 0.94.

The University Personality Inventory, which is commonly used to screen for mental problems in college students ( Yoshida et al., 1998 ), was also used to measure mental health. The 56 symptom items assessed whether an individual had experienced the described symptom during the past year (e.g., "a lack of interest in anything"). The UPI showed good internal consistency (Cronbach's α = 0.92) in a Chinese sample ( Zhang et al., 2015 ). In this study, the Cronbach's α of the UPI was 0.85.

Statistical Analyses

We first performed analyses to detect outliers. Any observation exceeding three standard deviations from the mean was replaced with the value at three standard deviations. This procedure affected no more than 5‰ of observations. Hierarchical regression analysis was conducted to determine the extent to which facets of critical thinking were related to mental health. In addition, structural equation modeling with Amos 22.0 was performed to assess the latent relationships between CT, impulsivity, and mental health.
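
The outlier rule described above, replacing values beyond three standard deviations with the boundary value itself, can be sketched in a few lines; the data here are simulated, not the study's:

```python
import numpy as np

def clip_outliers(x: np.ndarray, n_sd: float = 3.0) -> np.ndarray:
    """Replace observations beyond n_sd standard deviations from the mean
    with the boundary value (a winsorizing-style clip)."""
    m, s = x.mean(), x.std(ddof=1)
    return np.clip(x, m - n_sd * s, m + n_sd * s)

rng = np.random.default_rng(0)
x = rng.normal(0, 1, 1000)   # simulated scores
x[0] = 10.0                  # plant one extreme observation
clipped = clip_outliers(x)
```

Unlike deleting outliers, this keeps the sample size intact while bounding the influence of extreme scores on the regression estimates.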

Descriptive Statistics and Bivariate Correlations

Table 1 presents descriptive statistics and bivariate correlations for all the variables. CT dispositions such as truth-seeking, systematicity, self-confidence, and inquisitiveness were significantly correlated with the DASS-21 and UPI, but neither the CCTST-CV nor the SRPBB was related to the DASS-21 and UPI. Subscales of the BIS-11 were positively correlated with the DASS-21 and UPI, but were negatively associated with CT dispositions.

Table 1 . Descriptive results and correlations between all measured variables ( N = 314).

Regression Analyses

Hierarchical regression analyses were conducted to examine the effects of CT skill and disposition on mental health. Before conducting the analyses, scores on the DASS-21 and UPI were reversed so that high scores reflected high levels of mental health. Table 2 presents the results of the hierarchical regressions. In model 1, the sum of the Z-scores of the DASS-21 and UPI served as the dependent variable. Scores on the CT ability tests and on the five dimensions of the CCTDI served as predictors. CT skill and disposition explained 13% of the variance in mental health. CT skills did not significantly predict mental health. Two dimensions of disposition (truth-seeking and systematicity) exerted significantly positive effects on mental health. Model 2 examined whether CT predicted mental health after controlling for impulsivity. The model containing only impulsivity scores (see model 2, step 1 in Table 2 ) explained 15% of the variance in mental health. Non-planning impulsivity and motor impulsivity showed significantly negative effects on mental health. The CT variables entered at the second step explained significant unique variance (6%) in mental health (see model 2, step 2). This suggests that CT skill and disposition together explained unique variance in mental health after controlling for impulsivity. 1
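
The logic of hierarchical regression, entering one block of predictors first and then testing whether a second block explains additional variance, can be sketched as an incremental-R² computation. The data below are simulated stand-ins, not the study's:

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 from ordinary least squares with an intercept column added."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

# Simulated stand-in data: outcome depends on both predictor blocks
rng = np.random.default_rng(1)
n = 300
impulsivity = rng.normal(size=(n, 3))  # block 1: three impulsivity subscales
ct = rng.normal(size=(n, 2))           # block 2: CT predictors
y = impulsivity @ [0.4, 0.3, 0.0] + ct @ [0.5, 0.2] + rng.normal(size=n)

r2_step1 = r_squared(impulsivity, y)
r2_step2 = r_squared(np.column_stack([impulsivity, ct]), y)
delta_r2 = r2_step2 - r2_step1  # unique variance attributable to the CT block
```

The ΔR² at step 2 is what the paper reports as the unique variance (6%) explained by CT after controlling for impulsivity.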

Table 2 . Hierarchical regression models predicting mental health from critical thinking skills, critical thinking dispositions, and impulsivity ( N = 314).

Structural equation modeling was performed to examine whether impulsivity mediated the relationship between CT disposition and mental health (CT ability was not included since it did not significantly predict mental health). Since the regression results showed that only motor impulsivity and non-planning impulsivity significantly predicted mental health, we examined two mediation models with either motor impulsivity or non-planning impulsivity as the hypothesized mediator. The item scores on the motor impulsivity subscale were randomly divided into two indicators of motor impulsivity, as were the scores on the non-planning subscale. Scores on the DASS-21 and UPI served as indicators of mental health, and the dimensions of the CCTDI as indicators of CT disposition. In addition, a bootstrapping procedure with 5,000 resamples was used to test the direct and indirect effects. Amos 22.0 was used for the above analyses.

The mediation model that included motor impulsivity (see Figure 1 ) showed an acceptable fit, χ²(23) = 64.71, RMSEA = 0.076, CFI = 0.96, GFI = 0.96, NNFI = 0.93, SRMR = 0.073. Mediation analyses indicated that the 95% bootstrap confidence intervals of the indirect effect and the direct effect were (0.07, 0.26) and (−0.08, 0.32), respectively. As Hayes (2009) indicates, an effect is significant if zero does not lie between the lower and upper bounds of the 95% confidence interval. Accordingly, the indirect effect between CT disposition and mental health was significant, while the direct effect was not. Thus, motor impulsivity completely mediated the relationship between CT disposition and mental health.
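
The bootstrap test of an indirect effect can be sketched for a simple single-mediator case: resample the data with replacement, re-estimate the a path (predictor to mediator) and b path (mediator to outcome, controlling for the predictor), and take percentile bounds of a × b. This is a simplified OLS sketch on simulated data, not the latent-variable model fitted in Amos:

```python
import numpy as np

def boot_indirect_ci(x, m, y, n_boot=1000, seed=0):
    """Percentile bootstrap CI for the indirect effect a*b in a simple
    mediation model (x -> m -> y), estimated with OLS slopes."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)        # resample cases with replacement
        xs, ms, ys = x[idx], m[idx], y[idx]
        a = np.polyfit(xs, ms, 1)[0]       # slope of m on x
        # slope of y on m, controlling for x (partial b path)
        X = np.column_stack([np.ones(n), xs, ms])
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        b = beta[2]
        est.append(a * b)
    return np.percentile(est, [2.5, 97.5])

# Simulated data with a true indirect path (not the study's data)
rng = np.random.default_rng(42)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.6 * m + rng.normal(size=n)
lo, hi = boot_indirect_ci(x, m, y)
```

As in Hayes (2009), the indirect effect is judged significant when zero falls outside the resulting percentile interval.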

Figure 1 . Illustration of the mediation model: motor impulsivity as mediator variable between critical thinking dispositions and mental health. CTD-1 = Truth-seeking; CTD-2 = Analyticity; CTD-3 = Systematicity; CTD-4 = Self-confidence; CTD-5 = Inquisitiveness. MI-1 and MI-2 are sub-scores of motor impulsivity. Solid lines represent significant links and dotted lines non-significant links. ** p < 0.01.

The mediation model that included non-planning impulsivity (see Figure 2 ) also showed an acceptable fit to the data, χ²(23) = 52.75, RMSEA = 0.064, CFI = 0.97, GFI = 0.97, NNFI = 0.95, SRMR = 0.06. The 95% bootstrap confidence intervals of the indirect effect and the direct effect were (0.05, 0.33) and (−0.04, 0.38), respectively, indicating that non-planning impulsivity completely mediated the relationship between CT disposition and mental health.

Figure 2 . Illustration of the mediation model: non-planning impulsivity as mediator variable between critical thinking dispositions and mental health. CTD-1 = Truth-seeking; CTD-2 = Analyticity; CTD-3 = Systematicity; CTD-4 = Self-confidence; CTD-5 = Inquisitiveness. NI-1 and NI-2 are sub-scores of non-planning impulsivity. Solid lines represent significant links and dotted lines non-significant links. ** p < 0.01.

Discussion

This study examined how critical thinking skill and disposition are related to mental health. Theories of psychotherapy suggest that human mental problems are in part due to a lack of CT. However, empirical evidence for the hypothesized relationship between CT and mental health is relatively scarce. This study explored whether and how CT ability and disposition are associated with mental health. The results, based on a university student sample, indicated that CT skill and disposition explained a unique variance in mental health. Furthermore, the effect of CT disposition on mental health was mediated by motor impulsivity and non-planning impulsivity. The finding that CT exerted a significant effect on mental health accords with previous studies reporting negative correlations between CT disposition and mental disorders such as anxiety ( Suliman and Halabi, 2007 ). One reason is that CT disposition is usually conceptualized as personality traits or habits of mind, which are a notable predictor of mental health (e.g., Benzi et al., 2019 ). This study further found that of the five CT dispositions, only truth-seeking and systematicity were associated with individual differences in mental health. This is not surprising, since the truth-seeking items mainly assess one's inclination to seek the best knowledge in a given context and to reflect on additional facts, reasons, or opinions, even if this requires changing one's mind about certain issues. The systematicity items target one's disposition to approach problems in an orderly and focused way. Individuals with high levels of truth-seeking and systematicity are more likely to adopt a comprehensive, reflective, and controlled way of thinking, which is what cognitive therapy aims to achieve by shifting from an automatic mode of processing to a more reflective and controlled mode.

Another important finding was that motor impulsivity and non-planning impulsivity mediated the effect of CT disposition on mental health. The reason may be that people lacking CT are less willing to enter into a systematic analysis or deliberative decision-making process, resulting in more frequent rash behaviors or unplanned actions without regard for consequences ( Billieux et al., 2010 ; Franco et al., 2017 ). Such responses can have tangible negative consequences (e.g., conflict, aggression, addiction) that may lead to social maladjustment, which is regarded as a symptom of mental illness. By contrast, critical thinkers have a sense of deliberateness and consider alternative consequences before acting, and this thinking-before-acting mode would logically lead to a decrease in impulsivity, which in turn decreases the likelihood of problematic behaviors and negative moods.

It should be noted that although the raw correlation between attentional impulsivity and mental health was significant, regression analyses with the three dimensions of impulsivity as predictors showed that attentional impulsivity no longer exerted a significant effect on mental health after controlling for the other impulsivity dimensions. This suggests that the significant raw correlation between attentional impulsivity and mental health was due to the variance it shared with the other impulsivity dimensions (especially the non-planning dimension, which showed a moderately high correlation with attentional impulsivity, r = 0.67).

Some limitations of this study should be mentioned. First, the sample was limited: all the participants were university students majoring in statistics and mathematical finance, which restricts the generalizability of the findings. Future studies are recommended to recruit a more representative sample of university students; a study of generalization to a clinical sample is also recommended. Second, as this study was cross-sectional in nature, caution must be taken in interpreting the findings as causal. Further studies using longitudinal, controlled designs are needed to assess the effectiveness of CT interventions on mental health.

In spite of the limitations mentioned above, the findings of this study have implications for research and practical intervention. The result that CT contributed to individual differences in mental health provides empirical support for the theory underlying cognitive behavioral therapy, which focuses on changing irrational thoughts. The mediating role of impulsivity between CT and mental health gives a preliminary account of the mechanism by which CT is associated with mental health. Practically, although there is evidence that the CT disposition of students improves through teaching or training interventions (e.g., Profetto-Mcgrath, 2005 ; Sanja and Krstivoje, 2015 ; Chan, 2019 ), the finding that two CT disposition dimensions, truth-seeking and systematicity, are related to mental health further suggests that special attention should be paid to cultivating these specific dispositions and to enhancing students' control over impulsive behaviors in mental health promotion programs.

Conclusions

This study revealed that two CT dispositions, truth-seeking and systematicity, were associated with individual differences in mental health. Furthermore, the relationship between critical thinking and mental health was mediated by motor impulsivity and non-planning impulsivity. These findings provide a preliminary account of how human critical thinking is associated with mental health. Practically, developing mental health promotion programs for university students is suggested to pay special attention to cultivating their critical thinking dispositions (especially truth-seeking and systematicity) and enhancing the control of individuals over impulsive behaviors.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

The studies involving human participants were reviewed and approved by HUST Critical Thinking Research Center (Grant No. 2018CT012). The patients/participants provided their written informed consent to participate in this study.

Author Contributions

XR designed the study and revised the manuscript. ZL collected data and wrote the manuscript. SL assisted in analyzing the data. SS assisted in re-drafting and editing the manuscript. All the authors contributed to the article and approved the submitted version.

Funding

This work was supported by the Social Science Foundation of China (grant number: BBA200034).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

1. ^ We re-analyzed the data by controlling for age and gender of the participants in the regression analyses. The results were virtually the same as those reported in the study.

Barry, A., Parvan, K., Sarbakhsh, P., Safa, B., and Allahbakhshian, A. (2020). Critical thinking in nursing students and its relationship with professional self-concept and relevant factors. Res. Dev. Med. Educ. 9:7. doi: 10.34172/rdme.2020.007

Benzi, I. M. A., Emanuele, P., Rossella, D. P., Clarkin, J. F., and Fabio, M. (2019). Maladaptive personality traits and psychological distress in adolescence: the moderating role of personality functioning. Pers. Indiv. Diff. 140, 33–40. doi: 10.1016/j.paid.2018.06.026

Billieux, J., Gay, P., Rochat, L., and Van der Linden, M. (2010). The role of urgency and its underlying psychological mechanisms in problematic behaviours. Behav. Res. Ther. 48, 1085–1096. doi: 10.1016/j.brat.2010.07.008

Butler, H. A., Pentoney, C., and Bong, M. P. (2017). Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Think. Skills Creat. 25, 38–46. doi: 10.1016/j.tsc.2017.06.005

Chan, C. (2019). Using digital storytelling to facilitate critical thinking disposition in youth civic engagement: a randomized control trial. Child. Youth Serv. Rev. 107:104522. doi: 10.1016/j.childyouth.2019.104522

Chen, M. R. A., and Hwang, G. J. (2020). Effects of a concept mapping-based flipped learning approach on EFL students' English speaking performance, critical thinking awareness and speaking anxiety. Br. J. Educ. Technol. 51, 817–834. doi: 10.1111/bjet.12887

Cuijpers, P. (2019). Targets and outcomes of psychotherapies for mental disorders: an overview. World Psychiatry 18, 276–285. doi: 10.1002/wps.20661

Duffy, M. E., Twenge, J. M., and Joiner, T. E. (2019). Trends in mood and anxiety symptoms and suicide-related outcomes among u.s. undergraduates, 2007–2018: evidence from two national surveys. J. Adolesc. Health. 65, 590–598. doi: 10.1016/j.jadohealth.2019.04.033

Dwyer, C. P. (2017). Critical Thinking: Conceptual Perspectives and Practical Guidelines . Cambridge: Cambridge University Press.

Facione, P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction . Millbrae, CA: The California Academic Press.

Fortgang, R. G., Hultman, C. M., van Erp, T. G. M., and Cannon, T. D. (2016). Multidimensional assessment of impulsivity in schizophrenia, bipolar disorder, and major depressive disorder: testing for shared endophenotypes. Psychol. Med. 46, 1497–1507. doi: 10.1017/S0033291716000131

Franco, A. R., Costa, P. S., Butler, H. A., and Almeida, L. S. (2017). Assessment of undergraduates' real-world outcomes of critical thinking in everyday situations. Psychol. Rep. 120, 707–720. doi: 10.1177/0033294117701906

Gambrill, E. (2005). “Critical thinking, evidence-based practice, and mental health,” in Mental Disorders in the Social Environment: Critical Perspectives , ed S. A. Kirk (New York, NY: Columbia University Press), 247–269.

Ghanizadeh, A. (2017). The interplay between reflective thinking, critical thinking, self-monitoring, and academic achievement in higher education. Higher Educ. 74, 101–114. doi: 10.1007/s10734-016-0031-y

Gilbert, T. (2003). Some reflections on critical thinking and mental health. Teach. Phil. 24, 333–339. doi: 10.5840/teachphil200326446

Gregory, R. (1991). Critical thinking for environmental health risk education. Health Educ. Q. 18, 273–284. doi: 10.1177/109019819101800302

Hayes, A. F. (2009). Beyond Baron and Kenny: statistical mediation analysis in the new millennium. Commun. Monogr. 76, 408–420. doi: 10.1080/03637750903310360

Hollon, S. D., and Beck, A. T. (2013). “Cognitive and cognitive-behavioral therapies,” in Bergin and Garfield's Handbook of Psychotherapy and Behavior Change, Vol. 6 . ed M. J. Lambert (Hoboken, NJ: Wiley), 393–442.

Hong, L., Lin, C. D., Bray, M. A., and Kehle, T. J. (2010). The measurement of stressful events in Chinese college students. Psychol. Sch. 42, 315–323. doi: 10.1002/pits.20082

Jacobs, S. S. (1995). Technical characteristics and some correlates of the California Critical Thinking Skills Test, Forms A and B. Res. Higher Educ. 36, 89–108.

Kargar, F. R., Ajilchi, B., Goreyshi, M. K., and Noohi, S. (2013). Effect of creative and critical thinking skills teaching on identity styles and general health in adolescents. Proc. Soc. Behav. Sci. 84, 464–469. doi: 10.1016/j.sbspro.2013.06.585

Leclair, M. C., Lemieux, A. J., Roy, L., Martin, M. S., Latimer, E. A., and Crocker, A. G. (2020). Pathways to recovery among homeless people with mental illness: Is impulsiveness getting in the way? Can. J. Psychiatry. 65, 473–483. doi: 10.1177/0706743719885477

Lee, H. S., Kim, S., Choi, I., and Lee, K. U. (2008). Prevalence and risk factors associated with suicide ideation and attempts in Korean college students. Psychiatry Investig. 5, 86–93. doi: 10.4306/pi.2008.5.2.86

Li, S., Ren, X., Schweizer, K., Brinthaupt, T. M., and Wang, T. (2021). Executive functions as predictors of critical thinking: behavioral and neural evidence. Learn. Instruct. 71:101376. doi: 10.1016/j.learninstruc.2020.101376

Lin, Y. (2018). Developing Critical Thinking in EFL Classes: An Infusion Approach . Singapore: Springer Publications.

Luo, Q. X., and Yang, X. H. (2002). Revising on the Chinese version of the California Critical Thinking Skills Test. Psychol. Sci. (Chinese) 25, 740–741.

Mamun, M. A., Misti, J. M., Hosen, I., and Mamun, F. A. (2021). Suicidal behaviors and university entrance test-related factors: a Bangladeshi exploratory study. Persp. Psychiatric Care. 4, 1–10. doi: 10.1111/ppc.12783

Patton, J. H., Stanford, M. S., and Barratt, E. S. (1995). Factor structure of the Barratt impulsiveness scale. J Clin. Psychol. 51, 768–774. doi: 10.1002/1097-4679(199511)51:63.0.CO;2-1

Peng, M. C., Wang, G. C., Chen, J. L., Chen, M. H., Bai, H. H., Li, S. G., et al. (2004). Validity and reliability of the Chinese critical thinking disposition inventory. J. Nurs. China (Zhong Hua Hu Li Za Zhi). 39, 644–647.

Profetto-Mcgrath, J. (2005). Critical thinking and evidence-based practice. J. Prof. Nurs. 21, 364–371. doi: 10.1016/j.profnurs.2005.10.002

Ren, X., Tong, Y., Peng, P., and Wang, T. (2020). Critical thinking predicts academic performance beyond general cognitive ability: evidence from adults and children. Intelligence 82:101487. doi: 10.1016/j.intell.2020.101487

Sanja, M., and Krstivoje, S. (2015). Developing critical thinking in elementary mathematics education through a suitable selection of content and overall student performance. Proc. Soc. Behav. Sci. 180, 653–659. doi: 10.1016/j.sbspro.2015.02.174

Stupple, E., Maratos, F. A., Elander, J., Hunt, T. E., and Aubeeluck, A. V. (2017). Development of the Critical Thinking Toolkit (CriTT): a measure of student attitudes and beliefs about critical thinking. Think. Skills Creat. 23, 91–100. doi: 10.1016/j.tsc.2016.11.007

Suliman, W. A., and Halabi, J. (2007). Critical thinking, self-esteem, and state anxiety of nursing students. Nurse Educ. Today. 27, 162–168. doi: 10.1016/j.nedt.2006.04.008

Ugwuozor, F. O., Otu, M. S., and Mbaji, I. N. (2021). Critical thinking intervention for stress reduction among undergraduates in the Nigerian Universities. Medicine 100:25030. doi: 10.1097/MD.0000000000025030

Velotti, P., Garofalo, C., Petrocchi, C., Cavallo, F., Popolo, R., and Dimaggio, G. (2016). Alexithymia, emotion dysregulation, impulsivity and aggression: a multiple mediation model. Psychiatry Res. 237, 296–303. doi: 10.1016/j.psychres.2016.01.025

Wang, K., Shi, H. S., Geng, F. L., Zou, L. Q., Tan, S. P., Wang, Y., et al. (2016). Cross-cultural validation of the Depression Anxiety Stress Scale-21 in China. Psychol. Assess. 28:e88. doi: 10.1037/pas0000207

West, R. F., Toplak, M. E., and Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: associations with cognitive ability and thinking dispositions. J. Educ. Psychol. 100, 930–941. doi: 10.1037/a0012842

Yoshida, T., Ichikawa, T., Ishikawa, T., and Hori, M. (1998). Mental health of visually and hearing impaired students from the viewpoint of the University Personality Inventory. Psychiatry Clin. Neurosci. 52, 413–418.

Yoshinori, S., and Marcus, G. (2013). The dual effects of critical thinking disposition on worry. PLoS ONE 8:e79714. doi: 10.1371/journal.pone.0079714

Zhang, J., Lanza, S., Zhang, M., and Su, B. (2015). Structure of the University Personality Inventory for Chinese college students. Psychol. Rep. 116, 821–839. doi: 10.2466/08.02.PR0.116k26w3

Keywords: mental health, critical thinking ability, critical thinking disposition, impulsivity, depression

Citation: Liu Z, Li S, Shang S and Ren X (2021) How Do Critical Thinking Ability and Critical Thinking Disposition Relate to the Mental Health of University Students? Front. Psychol. 12:704229. doi: 10.3389/fpsyg.2021.704229

Received: 04 May 2021; Accepted: 21 July 2021; Published: 19 August 2021.

Copyright © 2021 Liu, Li, Shang and Ren. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Xuezhu Ren, renxz@hust.edu.cn

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

University of Louisville


Ideas to Action (i2a)

  • What is Critical Thinking?

The ability to think critically calls for higher-order thinking, not simply the ability to recall information.

Definitions of critical thinking, its elements, and its associated activities fill the educational literature of the past forty years. Critical thinking has been described as an ability to question; to acknowledge and test previously held assumptions; to recognize ambiguity; to examine, interpret, evaluate, reason, and reflect; to make informed judgments and decisions; and to clarify, articulate, and justify positions (Hullfish & Smith, 1961; Ennis, 1962; Ruggiero, 1975; Scriven, 1976; Hallet, 1984; Kitchener, 1986; Pascarella & Terenzini, 1991; Mines et al., 1990; Halpern, 1996; Paul & Elder, 2001; Petress, 2004; Holyoak & Morrison, 2005; among others).

After a careful review of the mountainous body of literature defining critical thinking and its elements, UofL has chosen to adopt the language of Michael Scriven and Richard Paul (2003) as a comprehensive, concise operating definition:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.

Paul and Scriven go on to suggest that critical thinking is based on: "universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness. It entails the examination of those structures or elements of thought implicit in all reasoning: purpose, problem, or question-at-issue, assumptions, concepts, empirical grounding; reasoning leading to conclusions, implication and consequences, objections from alternative viewpoints, and frame of reference. Critical thinking - in being responsive to variable subject matter, issues, and purposes - is incorporated in a family of interwoven modes of thinking, among them: scientific thinking, mathematical thinking, historical thinking, anthropological thinking, economic thinking, moral thinking, and philosophical thinking."

This conceptualization of critical thinking has been refined and developed further by Richard Paul and Linda Elder into the Paul-Elder framework of critical thinking. Currently, this approach is one of the most widely published and cited frameworks in the critical thinking literature. According to the Paul-Elder framework, critical thinking is the:

  • Analysis of thinking by focusing on the parts or structures of thinking ("the Elements of Thought")
  • Evaluation of thinking by focusing on the quality ("the Universal Intellectual Standards")
  • Improvement of thinking by using what you have learned ("the Intellectual Traits")

Selection of a Critical Thinking Framework

The University of Louisville chose the Paul-Elder model of Critical Thinking as the approach to guide our efforts in developing and enhancing our critical thinking curriculum. The Paul-Elder framework was selected based on criteria adapted from the characteristics of a good model of critical thinking developed at Surry Community College. The Paul-Elder critical thinking framework is comprehensive, uses discipline-neutral terminology, is applicable to all disciplines, defines specific cognitive skills including metacognition, and offers high quality resources.

Why the selection of a single critical thinking framework?

The use of a single critical thinking framework is an important aspect of institution-wide critical thinking initiatives (Paul and Nosich, 1993; Paul, 2004). According to this view, critical thinking instruction should not be relegated to one or two disciplines or departments with discipline specific language and conceptualizations. Rather, critical thinking instruction should be explicitly infused in all courses so that critical thinking skills can be developed and reinforced in student learning across the curriculum. The use of a common approach with a common language allows for a central organizer and for the development of critical thinking skill sets in all courses.


Copyright © 2012 - University of Louisville , Delphi Center

Introduction to Logic and Critical Thinking

(10 reviews)

Matthew Van Cleave, Lansing Community College

Copyright Year: 2016

Publisher: Matthew J. Van Cleave

Language: English

Conditions of use: Attribution

Reviewed by "yusef" Alexander Hayes, Professor, North Shore Community College on 6/9/21

Comprehensiveness rating: 5

Formal and informal reasoning, argument structure, and fallacies are covered comprehensively, meeting the author's goal of both depth and succinctness.

Content Accuracy rating: 5

The book is accurate.

Relevance/Longevity rating: 5

While many modern examples are used, and they are helpful, they are not necessarily needed. The usefulness of logical principles and skills have proved themselves, and this text presents them clearly with many examples.

Clarity rating: 5

It is obvious that the author cares about their subject, audience, and students. The text is comprehensible and interesting.

Consistency rating: 5

The format is easy to understand and is consistent in framing.

Modularity rating: 5

This text would be easy to adapt.

Organization/Structure/Flow rating: 5

The organization is excellent; my one suggestion would be a concluding chapter.

Interface rating: 5

I accessed the PDF version and it would be easy to work with.

Grammatical Errors rating: 5

The writing is excellent.

Cultural Relevance rating: 5

This is not an offensive text.

Reviewed by Susan Rottmann, Part-time Lecturer, University of Southern Maine on 3/2/21

Comprehensiveness rating: 4

I reviewed this book for a course titled "Creative and Critical Inquiry into Modern Life." It won't meet all my needs for that course, but I haven't yet found a book that would. I wanted to review this one because it states in the preface that it fits better for a general critical thinking course than for a true logic course. I'm not sure that I'd agree. I have been using Browne and Keeley's "Asking the Right Questions: A Guide to Critical Thinking," and I think that book is a better introduction to critical thinking for non-philosophy majors. However, the latter is not open source so I will figure out how to get by without it in the future. Overall, the book seems comprehensive if the subject is logic. The index is on the short side, but fine. However, one issue for me is that there are no page numbers on the table of contents, which is pretty annoying if you want to locate particular sections.

Content Accuracy rating: 4

I didn't find any errors. In general the book uses great examples. However, they are very much based in the American context, not for an international student audience. Some effort to broaden the chosen examples would make the book more widely applicable.

Relevance/Longevity rating: 4

I think the book will remain relevant because of the nature of the material that it addresses, however there will be a need to modify the examples in future editions and as the social and political context changes.

Clarity rating: 3

The text is lucid, but I think it would be difficult for introductory-level students who are not philosophy majors. For example, in Browne and Keeley's "Asking the Right Questions: A Guide to Critical Thinking," the sub-headings are very accessible, such as "Experts cannot rescue us, despite what they say" or "wishful thinking: perhaps the biggest single speed bump on the road to critical thinking." By contrast, Van Cleave's "Introduction to Logic and Critical Thinking" has more subheadings like this: "Using your own paraphrases of premises and conclusions to reconstruct arguments in standard form" or "Propositional logic and the four basic truth functional connectives." If students are prepared very well for the subject, it would work fine, but for students who are newly being introduced to critical thinking, it is rather technical.

It seems to be very consistent in terms of its terminology and framework.

Modularity rating: 4

The book is divided into 4 chapters, each having many sub-chapters. In that sense, it is readily divisible and modular. However, as noted above, there are no page numbers on the table of contents, which would make assigning certain parts rather frustrating. Also, I'm not sure why the book is only four chapters and has so many subheadings (for instance 17 in Chapter 2) and a length of 242 pages. Wouldn't it make more sense to break up the book into shorter chapters? I think this would make it easier to read and to assign in specific blocks to students.

Organization/Structure/Flow rating: 4

The organization of the book is fine overall, although I think adding page numbers to the table of contents and breaking it up into more separate chapters would help it to be more easily navigable.

Interface rating: 4

The book is very simply presented. In my opinion it is actually too simple. There are few boxes or diagrams that highlight and explain important points.

The text seems fine grammatically. I didn't notice any errors.

The book is written with an American audience in mind, but I did not notice culturally insensitive or offensive parts.

Overall, this book is not for my course, but I think it could work well in a philosophy course.

Reviewed by Daniel Lee, Assistant Professor of Economics and Leadership, Sweet Briar College on 11/11/19

Comprehensiveness rating: 3

This textbook is not particularly comprehensive (4 chapters long), but I view that as a benefit. In fact, I recommend it not for traditional logic classes, but rather for interdisciplinary classes that evaluate arguments.

To the best of my ability, I regard this content as accurate, error-free, and unbiased

The book is broadly relevant and up-to-date, with a few stray temporal references (Sydney Olympics, particular presidencies). I don't view these time-dated examples as problematic, as the logical underpinnings are still there and easily assessed.

Clarity rating: 4

My only pushback on clarity is I didn't find the distinction between argument and explanation particularly helpful/useful/easy to follow. However, this experience may have been unique to my class.

To the best of my ability, I regard this content as internally consistent

I found this text quite modular, and was easily able to integrate other texts into my lessons and disregard certain chapters or sub-sections

The book had a logical and consistent structure, but to the extent that there are only 4 chapters, there isn't much scope for alternative approaches here

No problems with the book's interface

The text is grammatically sound

Cultural Relevance rating: 4

Perhaps the text could have been more universal in its approach. While I didn't find the book insensitive per se, logic can be tricky here because the point is to evaluate meaningful (non-trivial) arguments, but any argument with that sense of gravity can also be traumatic to students (abortion, death penalty, etc.)

No additional comments

Reviewed by Lisa N. Thomas-Smith, Graduate Part-time Instructor, CU Boulder on 7/1/19

The text covers all the relevant technical aspects of introductory logic and critical thinking, and covers them well. A separate glossary would be quite helpful to students. However, the terms are clearly and thoroughly explained within the text, and the index is very thorough.

The content is excellent. The text is thorough and accurate with no errors that I could discern. The terminology and exercises cover the material nicely and without bias.

The text should easily stand the test of time. The exercises are excellent and would be very helpful for students to internalize correct critical thinking practices. Because of the logical arrangement of the text and the many sub-sections, additional material should be very easy to add.

The text is extremely clearly and simply written. I anticipate that a diligent student could learn all of the material in the text with little additional instruction. The examples are relevant and easy to follow.

The text did not confuse terms or use inconsistent terminology, which is very important in a logic text. The discipline often uses multiple terms for the same concept, but this text avoids that trap nicely.

The text is fairly easily divisible. Since there are only four chapters, those chapters include large blocks of information. However, the chapters themselves are very well delineated and could be easily broken up so that parts could be left out or covered in a different order from the text.

The flow of the text is excellent. All of the information is handled solidly in an order that allows the student to build on the information previously covered.

The PDF Table of Contents does not include links or page numbers which would be very helpful for navigation. Other than that, the text was very easy to navigate. All the images, charts, and graphs were very clear

I found no grammatical errors in the text.

Cultural Relevance rating: 3

The text including examples and exercises did not seem to be offensive or insensitive in any specific way. However, the examples included references to black and white people, but few others. Also, the text is very American specific with many examples from and for an American audience. More diversity, especially in the examples, would be appropriate and appreciated.

Reviewed by Leslie Aarons, Associate Professor of Philosophy, CUNY LaGuardia Community College on 5/16/19

This is an excellent introductory (first-year) Logic and Critical Thinking textbook. The book covers the important elementary information, clearly discussing such things as the purpose and basic structure of an argument; the difference between an argument and an explanation; validity; soundness; and the distinctions between an inductive and a deductive argument in accessible terms in the first chapter. It also does a good job introducing and discussing informal fallacies (Chapter 4). The incorporation of opportunities to evaluate real-world arguments is also very effective. Chapter 2 also covers a number of formal methods of evaluating arguments, such as Venn Diagrams and Propositional logic and the four basic truth functional connectives, but to my mind, it is much more thorough in its treatment of Informal Logic and Critical Thinking skills, than it is of formal logic. I also appreciated that Van Cleave’s book includes exercises with answers and an index, but there is no glossary; which I personally do not find detracts from the book's comprehensiveness.

Overall, Van Cleave's book is error-free and unbiased. The language used is accessible and engaging. There were no glaring inaccuracies that I was able to detect.

Van Cleave's textbook uses relevant, contemporary content that will stand the test of time, at least for the next few years. Although some examples use certain subjects like former President Obama, it does so in a useful manner that inspires the use of critical thinking skills. There is an abundance of examples that inspire students to look at issues from many different political viewpoints, challenging students to practice evaluating arguments, and identifying fallacies. Many of these exercises encourage students to critique issues, and recognize their own inherent reader-biases and challenge their own beliefs--hallmarks of critical thinking.

As mentioned previously, the author has an accessible style that makes the content relatively easy to read and engaging. He also does a suitable job explaining jargon/technical language that is introduced in the textbook.

Van Cleave uses terminology consistently and the chapters flow well. The textbook orients the reader by offering effective introductions to new material, step-by-step explanations of the material, as well as offering clear summaries of each lesson.

This textbook's modularity is really quite good. Its language and structure are not overly convoluted or too-lengthy, making it convenient for individual instructors to adapt the materials to suit their methodological preferences.

The topics in the textbook are presented in a logical and clear fashion. The structure of the chapters is such that it is not necessary to have to follow the chapters in their sequential order, and coverage of material can be adapted to individual instructors' preferences.

The textbook is free of any problematic interface issues. Topics, sections and specific content are accessible and easy to navigate. Overall it is user-friendly.

I did not find any significant grammatical issues with the textbook.

The textbook is not culturally insensitive, making use of a diversity of inclusive examples. Materials are especially effective for first-year critical thinking/logic students.

I intend to adopt Van Cleave's textbook for a Critical Thinking class I am teaching at the Community College level. I believe that it will help me facilitate student-learning, and will be a good resource to build additional classroom activities from the materials it provides.

Reviewed by Jennie Harrop, Chair, Department of Professional Studies, George Fox University on 3/27/18

While the book is admirably comprehensive, its extensive details within a few short chapters may feel overwhelming to students. The author tackles an impressive breadth of concepts in Chapters 1, 2, 3, and 4, which leads to 50-plus-page chapters that are dense with statistical analyses and critical vocabulary. These topics are likely better broached in manageable snippets rather than hefty single chapters.

The ideas addressed in Introduction to Logic and Critical Thinking are accurate but at times notably political. While politics are effectively used to exemplify key concepts, some students may be distracted by distinct political leanings.

The terms and definitions included are relevant, but the examples are specific to the current political, cultural, and social climates, which could make the materials seem dated in a few years without intentional and consistent updates.

While the reasoning is accurate, the author tends to complicate rather than simplify -- perhaps in an effort to cover a spectrum of related concepts. Beginning readers are likely to be overwhelmed and under-encouraged by his approach.

Consistency rating: 3

The four chapters are somewhat consistent in their play of definition, explanation, and example, but the structure of each chapter varies according to the concepts covered. In the third chapter, for example, key ideas are divided into sub-topics numbering from 3.1 to 3.10. In the fourth chapter, the sub-divisions are further divided into sub-sections numbered 4.1.1-4.1.5, 4.2.1-4.2.2, and 4.3.1 to 4.3.6. Readers who are working quickly to master new concepts may find themselves mired in similarly numbered subheadings, longing for grounded concepts on which to hinge other key principles.

Modularity rating: 3

The book's four chapters make it mostly self-referential. The author would do well to break this text down into additional subsections, easing readers' accessibility.

The content of the book flows logically and well, but the information needs to be better sub-divided within each larger chapter, easing the student experience.

The book's interface is effective, allowing readers to move from one section to the next with a single click. Additional sub-sections would ease this interplay even further.

Grammatical Errors rating: 4

Some minor errors throughout.

For the most part, the book is culturally neutral, avoiding direct cultural references in an effort to remain relevant.

Reviewed by Yoichi Ishida, Assistant Professor of Philosophy, Ohio University on 2/1/18

This textbook covers enough topics for a first-year course on logic and critical thinking. Chapter 1 covers the basics as in any standard textbook in this area. Chapter 2 covers propositional logic and categorical logic. In propositional logic, this textbook does not cover suppositional arguments, such as conditional proof and reductio ad absurdum. But other standard argument forms are covered. Chapter 3 covers inductive logic, and here this textbook introduces probability and its relationship with cognitive biases, which are rarely discussed in other textbooks. Chapter 4 introduces common informal fallacies. The answers to all the exercises are given at the end. However, the last set of exercises is in Chapter 3, Section 5. There are no exercises in the rest of the chapter. Chapter 4 has no exercises either. There is an index, but no glossary.

The textbook is accurate.

The content of this textbook will not become obsolete soon.

The textbook is written clearly.

The textbook is internally consistent.

The textbook is fairly modular. For example, Chapter 3, together with a few sections from Chapter 1, can be used as a short introduction to inductive logic.

The textbook is well-organized.

There are no interface issues.

I did not find any grammatical errors.

This textbook is relevant to a first semester logic or critical thinking course.

Reviewed by Payal Doctor, Associate Professro, LaGuardia Community College on 2/1/18

This text is a beginner textbook for arguments and propositional logic. It covers the basics of identifying arguments, building arguments, and using basic logic to construct propositions and arguments. It is quite comprehensive for a beginner book, but seems to be a good text for a course that needs a foundation for arguments. There are exercises on creating truth tables and proofs, so it could work as a logic primer in short sessions or with the addition of other course content.

The book is accurate in the information it presents. It does not contain errors and is unbiased. It covers the essential vocabulary clearly and gives ample examples and exercises to ensure the student understands the concepts.

The content of the book is up to date and can be easily updated. Some examples are very current for analyzing the argument structure in a speech, but for this sort of text understandable examples are important and the author uses good examples.

The book is clear and easy to read. In particular, this is a good text for community college students who often have difficulty with reading comprehension. The language is straightforward and concepts are well explained.

The book is consistent in terminology, formatting, and examples. It flows well from one topic to the next, but it is also possible to jump around the text without losing the voice of the text.

The book is broken down into sub-units that make it easy to assign short blocks of content at a time. Later in the text, it does refer to a few concepts that appear early in that text, but these are all basic concepts that must be used to create a clear and understandable text. No sections are too long and each section stays on topic and relates the topic to those that have come before when necessary.

The flow of the text is logical and clear. It begins with the basic building blocks of arguments, and practice identifying more and more complex arguments is offered. Each chapter builds up from the previous chapter in introducing propositional logic, truth tables, and logical arguments. A select number of fallacies are presented at the end of the text, but these are related to topics that were presented before, so it makes sense to have these last.
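For readers unfamiliar with the material the reviews describe, a truth table simply enumerates every assignment of truth values to a formula's atomic propositions. The short Python sketch below is illustrative only (it is not taken from the reviewed textbook); it tabulates the four basic truth-functional connectives that several reviewers mention:

```python
from itertools import product

# The four basic truth-functional connectives of propositional logic:
# conjunction, disjunction, the material conditional, and negation.
connectives = {
    "P and Q": lambda p, q: p and q,
    "P or Q": lambda p, q: p or q,
    "if P then Q": lambda p, q: (not p) or q,  # material conditional
    "not P": lambda p, q: not p,               # unary: q is ignored
}

def truth_table(name):
    """Return rows (P, Q, value) for the named connective,
    in the conventional order TT, TF, FT, FF."""
    f = connectives[name]
    return [(p, q, f(p, q)) for p, q in product([True, False], repeat=2)]

for row in truth_table("if P then Q"):
    print(row)
```

The conditional's table is the one students usually find surprising: it is false only on the row where P is true and Q is false.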

The text is free of interface issues. I used the PDF and it worked fine on various devices without losing formatting.

The book contains no grammatical errors.

The text is culturally sensitive, but examples used are a bit odd and may be objectionable to some students. For instance, President Obama's speech on Syria is used to evaluate an extended argument. This is an excellent example and it is explained well, but some who disagree with Obama's policies may have trouble moving beyond their own politics. However, other examples look at issues from all political viewpoints and ask students to evaluate the argument, fallacy, etc. and work towards looking past their own beliefs. Overall this book does use a variety of examples that most students can understand and evaluate.

My favorite part of this book is that it seems to be written for community college students. My students have trouble understanding readings in the New York Times, so it is nice to see a logic and critical thinking text use real language that students can understand and follow without the constant need of a dictionary.

Reviewed by Rebecca Owen, Adjunct Professor, Writing, Chemeketa Community College on 6/20/17

This textbook is quite thorough--there are conversational explanations of argument structure and logic. I think students will be happy with the conversational style this author employs. Also, there are many examples and exercises using current events, funny scenarios, or other interesting ways to evaluate argument structure and validity. The third section, which deals with logical fallacies, is very clear and comprehensive. My only critique of the material included in the book is that the middle section may be a bit dense and math-oriented for learners who appreciate the more informal, informative style of the first and third sections. Also, the book ends rather abruptly--it moves from a description of a logical fallacy to the answers for the exercises earlier in the text.

The content is very reader-friendly, and the author writes with authority and clarity throughout the text. There are a few surface-level typos (Starbuck's instead of Starbucks, etc.). None of these small errors detract from the quality of the content, though.

One thing I really liked about this text was the author's wide variety of examples. To demonstrate different facets of logic, he used examples from current media, movies, literature, and many other concepts that students would recognize from their daily lives. The exercises in this text also included these types of pop-culture references, and I think students will enjoy the familiarity--as well as being able to see the logical structures behind these types of references. I don't think the text will need to be updated to reflect new instances and occurrences; the author did a fine job at picking examples that are relatively timeless. As far as the subject matter itself, I don't think it will become obsolete any time soon.

The author writes in a very conversational, easy-to-read manner. The examples used are quite helpful. The third section on logical fallacies is quite easy to read, follow, and understand. A student in an argument writing class could benefit from this section of the book. The middle section is less clear, though. A student learning about the basics of logic might have a hard time digesting all of the information contained in chapter two. This material might be better in two separate chapters. I think the author loses the balance of a conversational, helpful tone and focuses too heavily on equations.

Consistency rating: 4

Terminology in this book is quite consistent--the key words are highlighted in bold. Chapters 1 and 3 follow a similar organizational pattern, but chapter 2 is where the material becomes more dense and equation-heavy. I also would have liked a closing passage--something to indicate to the reader that we've reached the end of the chapter as well as the book.

I liked the overall structure of this book. If I'm teaching an argumentative writing class, I could easily point the students to the chapters where they can identify and practice identifying fallacies, for instance. The opening chapter is clear in defining the necessary terms, and it gives the students an understanding of the toolbox available to them in assessing and evaluating arguments. Even though I found the middle section to be dense, smaller portions could be assigned.

The author does a fine job connecting each defined term to the next. He provides examples of how each defined term works in a sentence or in an argument, and then he provides practice activities for students to try. The answers for each question are listed in the final pages of the book. The middle section feels like the heaviest part of the whole book--it would take the longest time for a student to digest if assigned the whole chapter. Even though this middle section is a bit heavy, it does fit the overall structure and flow of the book. New material builds on previous chapters and sub-chapters. It ends abruptly--I didn't realize that it had ended, and all of a sudden I found myself in the answer section for those earlier exercises.

The simple layout is quite helpful! There is nothing distracting, image-wise, in this text. The table of contents is clearly arranged, and each topic is easy to find.

Tiny edits could be made (Starbuck's/Starbucks, for one). Otherwise, it is free of distracting grammatical errors.

This text is quite culturally relevant. For instance, there is one example that mentions the rumors of Barack Obama's birthplace as somewhere other than the United States. This example is used to explain how to analyze an argument for validity. The more "sensational" examples (like the Obama one above) are helpful in showing argument structure, and they can also help students see how rumors like this might gain traction--as well as help to show students how to debunk them with their newfound understanding of argument and logic.

The writing style is excellent for the subject matter, especially in the third section explaining logical fallacies. Thank you for the opportunity to read and review this text!

Reviewed by Laurel Panser, Instructor, Riverland Community College on 6/20/17

This is a review of Introduction to Logic and Critical Thinking, an open source book version 1.4 by Matthew Van Cleave. The comparison book used was Patrick J. Hurley’s A Concise Introduction to Logic 12th Edition published by Cengage as well as the 13th edition with the same title. Lori Watson is the second author on the 13th edition.

Competing with Hurley is difficult with respect to comprehensiveness. For example, Van Cleave’s book is comprehensive to the extent that it probably covers at least two-thirds or more of what is dealt with in most introductory, one-semester logic courses. Van Cleave’s chapter 1 provides an overview of argumentation including discerning non-arguments from arguments, premises versus conclusions, deductive from inductive arguments, validity, soundness and more. Much of Van Cleave’s chapter 1 parallels Hurley’s chapter 1. Hurley’s chapter 3 regarding informal fallacies is comprehensive while Van Cleave’s chapter 4 on this topic is less extensive. Categorical propositions are a topic in Van Cleave’s chapter 2; Hurley’s chapters 4 and 5 provide more instruction on this, however. Propositional logic is another topic in Van Cleave’s chapter 2; Hurley’s chapters 6 and 7 provide more information on this, though. Van Cleave did discuss messy issues of language meaning briefly in his chapter 1; that is the topic of Hurley’s chapter 2.

Van Cleave’s book includes exercises with answers and an index. A glossary was not included.

Reviews of open source textbooks typically include criteria besides comprehensiveness. These include comments on accuracy of the information, whether the book will become obsolete soon, jargon-free clarity to the extent that is possible, organization, navigation ease, freedom from grammar errors, and cultural relevance; Van Cleave’s book is fine in all of these areas. Further criteria for open source books include modularity and consistency of terminology. Modularity is defined as including blocks of learning material that are easy to assign to students. Hurley’s book has a greater degree of modularity than Van Cleave’s textbook. The prose Van Cleave used is consistent.

Van Cleave’s book will not become obsolete soon.

Van Cleave’s book has accessible prose.

Van Cleave used terminology consistently.

Van Cleave’s book has a reasonable degree of modularity.

Van Cleave’s book is organized. The structure and flow of his book is fine.

Problems with navigation are not present.

Grammar problems were not present.

Van Cleave’s book is culturally relevant.

Van Cleave’s book is appropriate for some first semester logic courses.

Table of Contents

Chapter 1: Reconstructing and analyzing arguments

  • 1.1 What is an argument?
  • 1.2 Identifying arguments
  • 1.3 Arguments vs. explanations
  • 1.4 More complex argument structures
  • 1.5 Using your own paraphrases of premises and conclusions to reconstruct arguments in standard form
  • 1.6 Validity
  • 1.7 Soundness
  • 1.8 Deductive vs. inductive arguments
  • 1.9 Arguments with missing premises
  • 1.10 Assuring, guarding, and discounting
  • 1.11 Evaluative language
  • 1.12 Evaluating a real-life argument

Chapter 2: Formal methods of evaluating arguments

  • 2.1 What is a formal method of evaluation and why do we need them?
  • 2.2 Propositional logic and the four basic truth functional connectives
  • 2.3 Negation and disjunction
  • 2.4 Using parentheses to translate complex sentences
  • 2.5 “Not both” and “neither nor”
  • 2.6 The truth table test of validity
  • 2.7 Conditionals
  • 2.8 “Unless”
  • 2.9 Material equivalence
  • 2.10 Tautologies, contradictions, and contingent statements
  • 2.11 Proofs and the 8 valid forms of inference
  • 2.12 How to construct proofs
  • 2.13 Short review of propositional logic
  • 2.14 Categorical logic
  • 2.15 The Venn test of validity for immediate categorical inferences
  • 2.16 Universal statements and existential commitment
  • 2.17 Venn validity for categorical syllogisms

Chapter 3: Evaluating inductive arguments and probabilistic and statistical fallacies

  • 3.1 Inductive arguments and statistical generalizations
  • 3.2 Inference to the best explanation and the seven explanatory virtues
  • 3.3 Analogical arguments
  • 3.4 Causal arguments
  • 3.5 Probability
  • 3.6 The conjunction fallacy
  • 3.7 The base rate fallacy
  • 3.8 The small numbers fallacy
  • 3.9 Regression to the mean fallacy
  • 3.10 Gambler's fallacy

Chapter 4: Informal fallacies

  • 4.1 Formal vs. informal fallacies
  • 4.1.1 Composition fallacy
  • 4.1.2 Division fallacy
  • 4.1.3 Begging the question fallacy
  • 4.1.4 False dichotomy
  • 4.1.5 Equivocation
  • 4.2 Slippery slope fallacies
  • 4.2.1 Conceptual slippery slope
  • 4.2.2 Causal slippery slope
  • 4.3 Fallacies of relevance
  • 4.3.1 Ad hominem
  • 4.3.2 Straw man
  • 4.3.3 Tu quoque
  • 4.3.4 Genetic
  • 4.3.5 Appeal to consequences
  • 4.3.6 Appeal to authority

Answers to exercises

Glossary/Index

Ancillary Material

About the Book

This is an introductory textbook in logic and critical thinking. The goal of the textbook is to provide the reader with a set of tools and skills that will enable them to identify and evaluate arguments. The book is intended for an introductory course that covers both formal and informal logic. As such, it is not a formal logic textbook, but is closer to what one would find marketed as a “critical thinking textbook.”

About the Contributors

Matthew Van Cleave, PhD, Philosophy, University of Cincinnati, 2007. VAP at Concordia College (Moorhead), 2008-2012. Assistant Professor at Lansing Community College, 2012-2016. Professor at Lansing Community College, 2016-


Humanities LibreTexts

7.5: Logical Analysis using Truth Tables

Truth tables, which we introduced in the last section, are primarily useful in a different way. Instead of just using them to tell us what the different logical operators mean, we can use them to do some in depth analysis of the logical form of a statement, a set of statements, or an inference.

What’s a logical analysis? A logical analysis is a set of things we can do to learn something about a particular logical structure. The statement “I’ll go if you don’t go or if we can get a babysitter and it’s not too expensive for both of us to go” has a particular logical form—something like [(B \(\wedge\) ~E) \(\rightarrow\) I]. If we do a logical analysis of this logical form, we’ll find out certain things. For instance: when is this statement true? That is, what must the world be like for this statement to turn out to be true? Is this statement always true? Is it always false? What is its relationship with other similar logical forms like [~(B \(\wedge\) E) \(\rightarrow\) I]? [~(B \(\wedge\) ~E) \(\rightarrow\) ~I]? [(B \(\wedge\) E) \(\rightarrow\) ~I]? Is it possible that all of these statements could be true at the same time? That is, are they consistent? We can find out the answers to all of these questions using Truth Tables.

Conceptually speaking, we’re doing the following when we build a truth table:

1. Collecting all of the possible combinations of truth values and listing them out.

  • That is, we’re finding out all of the different ways that T and F can be combined for our atomic sentence letters.

2. Using each set of possible truth values to calculate the output truth value for a complex formula (or a set of complex formulas).

  • That is, we’re doing what we did in Computing Truth Values : we’re taking the truth values of atomic sentence letters as an input and calculating the single truth value output.

3. Analyzing the results.

  • That is, we’re looking at the resulting output and trying to figure out what it tells us about the logical formulas in question.

The goal is to see what possible conditions of the world (what combinations of true and false for the atomic propositions, each of which either describes the world correctly or incorrectly) give what sorts of truth values for the complex formulas we’re analyzing and then to look for patterns in the outputs to tell us something about the individual statement, the set of statements, or the argument we’re analyzing.

That’s not terribly illuminating, though, about how to actually go about building a truth table. Let’s look more concretely at what to do.

First, I must point out a new notation that will be unfamiliar to you. Well, you will have seen it before lots of times, but not here in logic. The forward slash!

If you see this:

[\(\neg\)[D \(\leftrightarrow\) \(\neg\)(X\(\rightarrow\) (Z \(\bullet\) Q))] \(\vee\) P] / (X\(\rightarrow\) (Z \(\bullet\) Q))

That means there are two formulas here:

[\(\neg\)[D \(\leftrightarrow\) \(\neg\)(X\(\rightarrow\) (Z \(\bullet\) Q))] \(\vee\) P]

(X\(\rightarrow\) (Z \(\bullet\) Q))

The forward slash (/) has no logical meaning. It simply separates formulas from one another so we can list them on a single line without it seeming like they are parts of the same formula.

Building a Truth Table

Building a truth table is very straightforward, but that doesn’t mean it’s not going to take some getting used to. Let’s divide it into a series of steps.

Step One: Figure out what size you need

How many unique sentence letters are in the formula or set of formulas? Just count ‘em up, counting each unique letter only once (so two B’s just count as one). Here are some examples:

  • [~(B\(\wedge\)E)\(\rightarrow\)I] has 3 unique letters: B, E, and I
  • [~(B\(\wedge\)E)\(\rightarrow\)B] has 2 unique letters: B and E
  • [~(I\(\wedge\)I)\(\rightarrow\)I] has 1 unique letter: I
  • [~(B\(\wedge\)E)\(\rightarrow\)I] / [(B\(\wedge\)E)\(\rightarrow\)~I] have 3 unique letters all together: B, E, and I
  • [\(\neg\)[D \(\leftrightarrow\) \(\neg\)(X\(\rightarrow\) (Z \(\bullet\) Q))] \(\vee\) P] / Z / (P \(\bullet\) Q) has 5 unique letters: D, X, Z, Q, and P

Once you’ve figured out this magic number, you plug it into a magic formula: \(2^n\), where \(n\) is the magic number: the number of unique sentence letters in the set of propositions the truth table is for. The result of this mathematical formula is the number of rows you’ll need in your truth table.
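To make the counting concrete, here is a minimal Python sketch of this step. The ASCII connectives `~`, `&`, and `->` stand in for the logical symbols, and any uppercase letter is treated as a sentence letter; both are illustrative assumptions, not notation from the text.

```python
import re

def table_size(formulas):
    """Count the unique sentence letters across all formulas and
    return (n, 2**n): the number of letters and the rows needed."""
    letters = set()
    for f in formulas:
        letters |= set(re.findall(r"[A-Z]", f))
    return len(letters), 2 ** len(letters)

# [~(B & E) -> I] / [(B & E) -> ~I] have 3 unique letters: B, E, I
print(table_size(["~(B & E) -> I", "(B & E) -> ~I"]))  # (3, 8)
```

Note that a letter appearing twice still counts only once, which is exactly the "two B's just count as one" rule above.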

Here is a set of truth table sizes:

  • 1 letter: 2 rows
  • 2 letters: 4 rows
  • 3 letters: 8 rows
  • 4 letters: 16 rows

Notice the pattern? The nice thing is that chances are, your instructor will only assign at most a 4-letter truth table, so 16 rows is the absolute most you’ll usually need to work with. Most instructors stick to 1, 2, and 3-letter truth tables.

The downside of truth tables is that it doesn't take long before you have to start making tables with 32, 64, 128, 256, 512, 65,536 rows! That's too much work to make building a truth table worthwhile. This is called the problem of Combinatorial Explosion, because all of the possible combinations "explode" to astronomical numbers. Nevertheless, truth tables are quite useful for relatively simple problems.

Step Two: Make a truth table

A truth table is a table with a column for each unique sentence letter (usually in the order in which they show up in the formulas you are analyzing) and then a column for each formula you are analyzing. Here are some examples:

Notice how each column gets its own label and there’s a bigger border separating the individual sentence letters (the inputs) from the complex formulas (the outputs). So far so good.

Step Three: Fill in the input side

This step is always the same no matter what the columns labels are. There are basically 3 different tables you’ll make, and for bigger tables you can extrapolate the same basic pattern. Remember that the goal when filling in the input side is to make a list of all of the possible combinations of the two truth values T and F.

So when we only have one unique sentence letter, the two possibilities are that the letter is True and that the letter is False:

It’s a bit more complicated if we have two unique sentence letters:

Each row in the table is a possible set of truth values. The possible combos are TT, TF, FT, and FF. See how that works?

Before moving onto an eight row truth table, let’s think about the pattern here. We’ve started on the right nearest the thick line separating the inputs from the outputs and alternated going down: TFTF. For the first table, we alternated too, but just had to do it once: TF. On the second table, we alternated TF going down the R column, and then just repeated so the downward pattern would be TFTF.

Then we moved left and alternated every 2, so the Q column reads TTFF. Why would we do this? Well, once we have the rows where R is true and where R is false, we need to test both of those possibilities when Q is True, and then test them again when Q is false. The result is something like the following:

It may even be helpful to think of it in these terms:

This way, we’re getting all the possible combinations of Q:true, Q:false, R:true, and R:false. If this little explanation is confusing for you, it’s probably best to move on. Perhaps it will make more sense later, and even if it doesn’t, it’s okay since this is sort of conceptual background work rather than something that is vital to understanding the truth table. What you need to understand at minimum is simply that by following this procedure, you’re creating all of the possible combinations of T and F for the atomic sentence letters in the formulas you are trying to analyze using the truth table.

Now let’s look at an eight row table. The first thing we do is start on the rightmost input (sentence letter) column, and alternate T and F every one row. Like so:

Then we move one column to the left and alternate every two. Like this:

Finally, we move to the left again and alternate every four rows so that we double the amount we are alternating by each time we move to the left. Like this:

The result is a truth table that’s totally ready to solve: we have our columns labeled, our rows easily distinguishable, and most importantly we have all of the input side filled in in the standard way. This input side will be basically the same for every truth table. That is, you always follow this pattern:

Start below the rightmost atomic sentence letter and alternate every one row. Then move to the left one column and alternate going down every two rows. Finally, move to the left one last column and alternate going down that column every four rows. Extrapolate for bigger tables.
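The standard fill pattern described above is exactly what `itertools.product` produces. Here is a small sketch (the T/F strings are just for display; this code is an illustration, not part of the text's method):

```python
from itertools import product

def input_side(letters):
    """List every combination of T and F in the standard order:
    the rightmost letter alternates every row, the next one to the
    left alternates every two rows, doubling with each column."""
    return list(product("TF", repeat=len(letters)))

for row in input_side("QR"):
    print("".join(row))  # TT, TF, FT, FF -- the four-row table
```

For three letters this produces the eight-row pattern automatically: the leftmost column reads TTTTFFFF, the middle TTFFTTFF, and the rightmost TFTFTFTF.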

Solving a Truth Table

There are two methods for solving a truth table, that is, for filling in the right side or output side of the table. With the Brute Force method, you calculate each cell in the table by plugging in the truth values of the sentence letters and working your way from the inside out. With the Intuitive method, you use your intuitive understanding of the operators to save yourself some work.

Brute Force Method

The Brute Force method of solving a truth table is simply to plug in the truth value for each individual letter by carrying them over from the input side, and then to calculate the truth value of each complex proposition given those inputs. This method is safer, so if you feel lost or lose confidence, just revert back to the Brute Force method. It does have two downsides, though: a) it requires a lot more work and so takes more time, and b) it involves more individual steps, so the probability of making a simple mistake increases a bit. The second problem is probably balanced out by the riskiness of the Intuitive Method.

1. Write the truth value of each letter as it appears on the left side of the table under each letter as it appears on the right side.

2. Starting with the "innermost" operators (those inside the most parentheses), calculate the truth value of the whole relationship.

3. Then work your way out until you've calculated the truth value of the main operator.

So, step 0 is to make the truth table, as discussed in Basic Symbolization:

Step 1 is to fill in the truth values on the right side for the sentence letters.

Step 2 is to find the innermost operators (the ones with the most parentheses around them).

And then solve for those values using the truth tables that we use to define each operator.

Repeat this process, working from "in" to "out," until you've solved the truth value of the main operator (the operator that sits outside all the other parentheses; keep in mind that if there are no outermost parentheses, then they are implied).

So Q \(\leftrightarrow\) (P\(\vee\)Q) is actually supposed to read: (Q \(\leftrightarrow\) (P\(\vee\)Q))

Repeat in each row:

Then you analyze the results by looking at the right side of the truth table!
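The Brute Force method can be automated in a few lines. In this sketch, formulas are written as Python boolean expressions (`or`, `and`, `not`, and `==` standing in for the biconditional); that substitution is my illustrative assumption, not notation from the text.

```python
from itertools import product

def column(formula, letters):
    """Brute Force: for each input row, substitute the truth values
    into the formula and record the computed output value."""
    results = []
    for values in product([True, False], repeat=len(letters)):
        row = dict(zip(letters, values))  # e.g. {"P": True, "Q": False}
        results.append(eval(formula, {}, row))
    return results

# (Q <-> (P v Q)) written as Q == (P or Q)
print(column("Q == (P or Q)", "PQ"))  # [True, False, True, True]
```

Reading down that list gives the output column for the four rows TT, TF, FT, FF, just as you would get by hand.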

Intuitive Method

The Intuitive Method to solving a truth table uses our intuitive understanding of the logical operators and their individual truth tables to save as much work as possible. We can often eliminate half of our work for one column in a single swoop.

It is a faster method, but also increases the chances that we’ll make a mistake by moving too quickly and missing something or being overconfident and ignoring important details.

How does it work? Let’s start with an example and then we’ll look at how it works a bit more.

Starting with the first column, we ask ourselves what we know about the main operator (\(\rightarrow\)). I notice that there is only one atomic letter as the antecedent to this conditional. I know that if the antecedent to a conditional is false the whole conditional is true (since it’s only false when T\(\rightarrow\)F and if it’s F\(\rightarrow\)?, then it’s certainly not T\(\rightarrow\)F!). So I just need to find the rows on which P is false and I know the whole first formula will be True!

Then we solve the rest more-or-less using the Brute Force method. When is (Q\(\leftrightarrow\)P) true? When they’re the same! It’s false if they’re different.

Now we look at the second column. When is a conjunction true? It’s only true in one case: when both conjuncts are true. So if Q is false...? Then the whole second formula (Q \(\wedge\) \(\neg\)(P\(\vee\)Q)) is false.

Now that we know Q is true in the remaining cells of the second output column, we need to ask ourselves when \(\neg\)(P\(\vee\)Q) is true. It says “neither P nor Q is true”. When would that be true? Only when P and Q are both false (remember it would be equivalent to (\(\neg\)P\(\wedge\)\(\neg\)Q)). Are both false in rows 1 or 3? Nopers. So that means that \(\neg\)(P\(\vee\)Q) is false in both of our remaining rows. If just one conjunct is false, the whole conjunction is false. So:

Okay, final column. Another way of doing the intuitive method is simply to understand what the formula says in a more intuitive way than the logical formula. (Q \(\wedge\) \(\neg\)P) says something like “Q is true and P is false.” When we understand it this way, it’s easier to figure out on which row(s) it will be true: we’re looking for the row(s) where Q is true and P is false. Which row is that?

Now we just fill in the rest as false since we know that on those rows P isn’t true while Q is false.

Okay, now that we’ve gone through the intuitive method, let’s take a look at some of the rules which we can use while doing the intuitive method. Here are the rules I use:

  • Antecedent F or Consequent T \(\rightarrow\) Whole Implication T
  • One disjunct T \(\rightarrow\) Whole Disjunction T
  • One conjunct F \(\rightarrow\) Whole Conjunction F
  • So “(\(\neg\)P\(\bullet\)Q)” means “P is false and Q is true”

These four rules can save you loads of time. Just identify the simplest element in a formula (surrounding the main operator): an antecedent? A consequent? A disjunct? A conjunct? Then identify when that element fits the intuitive rule, and eliminate lots of work!
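As an aside, these shortcut rules are the logical cousins of short-circuit evaluation in programming languages. A small Python sketch (the `crash` helper is hypothetical, defined here only to prove that the second operand is never inspected):

```python
def implies(p, q):
    """Material conditional: a false antecedent or a true
    consequent makes the whole implication true."""
    return (not p) or q

def crash():  # hypothetical helper: blows up if ever evaluated
    raise RuntimeError("never reached")

# Antecedent F => whole implication T (the consequent is irrelevant)
assert implies(False, None) is True

# One disjunct T => whole disjunction T (crash() is never evaluated)
assert (True or crash()) is True

# One conjunct F => whole conjunction F (crash() is never evaluated)
assert (False and crash()) is False
```

Python's `or` and `and` stop as soon as the answer is settled, which is precisely what the Intuitive Method asks you to do by eye.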

Analyzing a Truth Table

Classification.

If you’re being asked to analyze a single proposition using a truth table, then you automatically know that the answer will be one of three options. This is called a Classification problem because you’re classifying a single proposition.

1) Tautology/Tautologous : the column under the proposition is filled only with Ts.

A “Logical Truth” or Tautology is a statement that, regardless of how the world turns out to be, will be true. Think of “we’ll either have a democratic president or we won’t have a democratic president.” Even if the world ends and we have no president at all, that disjunctive statement is still true!

Example \(\PageIndex{1}\)

2) Self-Contradiction/Self-Contradictory : the column under the proposition is filled only with F’s.

A self-contradiction is always false no matter how the world ends up being. Think of the example “Kamala Harris is going to be our next president, but luckily we won’t have to have Kamala Harris as our next president.” It doesn’t matter what actually happens in the world, this statement will always be false. It can’t possibly be true because it contradicts itself.

Example \(\PageIndex{2}\)

3) Contingent : the column under the proposition is filled with a mixture of “T”s and “F”s.

Contingent propositions are true or false depending on how the world is. The simplest contingent propositions are atomic propositions, which either describe the world accurately or don’t describe the world accurately.

Example \(\PageIndex{3}\)
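The three classifications can be checked mechanically by inspecting the output column. A sketch, again writing formulas as Python boolean expressions (an illustrative substitution for the logical notation):

```python
from itertools import product

def classify(formula, letters):
    """Classify one proposition by its output column: all True is a
    tautology, all False a self-contradiction, a mix is contingent."""
    col = [eval(formula, {}, dict(zip(letters, vals)))
           for vals in product([True, False], repeat=len(letters))]
    if all(col):
        return "tautology"
    if not any(col):
        return "self-contradiction"
    return "contingent"

print(classify("P or not P", "P"))    # tautology
print(classify("P and not P", "P"))   # self-contradiction
print(classify("P or Q", "PQ"))       # contingent
```

The first example is the "democratic president or not" disjunction from above; the second mirrors the self-contradictory Kamala Harris sentence.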

If you’re being asked to analyze a set of propositions using a truth table, then you know that the answer will be one of the following four options. This is called a Comparison problem because you’re comparing multiple propositions with one another in order to determine what logical relation holds between them. This is the most complex type of truth table analysis problem.

1) Logically Equivalent : each row of the output side of the truth table is the same on each column. So each row is a homogeneous set of either all T’s or all F’s. Logically equivalent propositions are true and false in exactly the same states of the world: they give the same output every time.

Example \(\PageIndex{4}\)

2) Contradictory : the truth values are the opposite on each row of the output side of the truth table. Notice that, since we only have two truth values (T and F), that means that we could only ever have contradictory pairs of statements. A set of three or more couldn’t possibly be contradictory.

Example \(\PageIndex{5}\)

3) Consistent : Once you’ve determined that a set of statements is neither contradictory nor logically equivalent, you should check to see whether it is consistent. A consistent set of propositions is one where the logical form or structure of those propositions allows them to all be true at the same time. If the world is a certain way, then all of the statements will end up being true. So when testing for consistency, you’re simply looking at the output side of the truth table for a row completely filled with T’s. If you find it, then that set is consistent: they can all be true at the same time.

Example \(\PageIndex{6}\)

4) Inconsistent : The last option is to call a set of propositions inconsistent. If you never find that row filled completely with T’s, then the propositions you are analyzing are inconsistent: they cannot, as a matter of logical structure, be true at the same time, no matter the state of the world.

Example \(\PageIndex{7}\)

Note that a set of logically equivalent propositions is likely to be consistent, since it's likely to have one line on which all propositions are true. Furthermore, every contradictory pair of propositions is inconsistent. For the sake of the class, though, instructors will often require that you choose only one of the four options on a multiple-choice quiz or exam, so how do you decide?

Easy: just test for these four relations in order. Start by asking "are they logically equivalent?" Then, when you've found a line on which they have different truth values, ask yourself "could they be contradictory?" Next, if you decide that they aren't contradictory (or couldn't be because it's a set of 3 or more), go hunting for one row on which all of the columns have a T. If you find it, then the set is consistent. If you never find such a row, then the set is inconsistent.

In short : always answer with the strongest answer available. If a set of statements is both logically equivalent and consistent, then the correct answer on a multiple-choice test is “Logically Equivalent” since that’s a stronger claim (it’s a claim about every row rather than just one row).
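That testing order can be sketched in code. As before, formulas are Python boolean expressions (my illustrative substitution), and the function returns the strongest relation that holds:

```python
from itertools import product

def compare(formulas, letters):
    """Return the strongest relation, testing in order:
    equivalent, then contradictory (pairs only), then consistent."""
    rows = [[eval(f, {}, dict(zip(letters, vals))) for f in formulas]
            for vals in product([True, False], repeat=len(letters))]
    if all(len(set(row)) == 1 for row in rows):      # identical columns
        return "logically equivalent"
    if len(formulas) == 2 and all(row[0] != row[1] for row in rows):
        return "contradictory"
    if any(all(row) for row in rows):                # an all-True row
        return "consistent"
    return "inconsistent"

print(compare(["not (P and Q)", "(not P) or (not Q)"], "PQ"))  # logically equivalent
print(compare(["P", "not P"], "P"))                            # contradictory
print(compare(["P or Q", "P and Q"], "PQ"))                    # consistent
print(compare(["P", "not P", "P or Q"], "PQ"))                 # inconsistent
```

Because the checks run in order of strength, an equivalent set is reported as equivalent even though it is also consistent, matching the multiple-choice advice above.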

Testing for Validity

If you’re being asked to analyze an argument or inference , then you know that there are only two possible answers: Valid and Invalid. It’s easier to start by discussing an invalid inference:

1) Invalid : an inference or argument is invalid if you find a row of the output side of the truth table where all of the premises are true, but the conclusion is false. So this, depending on how many propositions make up the argument, will typically be a row that looks like TTTTF, TTF, TF, TTTF, etc. If you find even just one row where the output side looks like this, then you’ve proven that the argument is invalid. I like to think of it as searching for a radioactive row. If you find one with *all* true premises and a false conclusion, then you’ve found a radioactive row and therefore you’ve found out that the argument is invalid.

I like to think of the counterexample line—the line that tells you the inference is invalid— as a sort of “radioactive” line you’re searching for. Think of it like this: you’ve got your Geiger counter and you’re scanning through the truth table for radiation. If you find a radioactive line, then the argument is bad (invalid). If you don’t find a radioactive line, then the argument is clean (valid).

2) Valid : If you look through the whole output side of a truth table and never see a row where *every* premise is true and the conclusion is false, then you’ve found a valid argument. Remember that rows where one premise is false don’t count and rows where it’s all false premises and a true conclusion don’t count. Everything is safe except rows somewhat similar to TTF or TTTTF or the like, depending on how many premises and conclusions there are.

Validity means it’s impossible for the premises to be true while the conclusion is false. Each row of a truth table tells you that if the world is the way the input side describes, then the statements have the truth values shown on the output side. So if you find a row (even just one!) on which all the premises are true while the conclusion is false, you’ve shown that this combination is possible. That could never happen with a valid argument, so it follows that the argument you’re analyzing is invalid.
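Though the text’s method is pencil-and-paper, the radioactive-row search can be sketched in a few lines of Python (my own encoding, for illustration only; each sentence is represented as a truth function over the atomic letters):

```python
from itertools import product

def is_valid(premises, conclusion, n):
    """Scan every row of the truth table for n atomic letters, looking for
    a 'radioactive' row: all premises true, conclusion false."""
    for row in product([True, False], repeat=n):
        if all(p(*row) for p in premises) and not conclusion(*row):
            return False   # radioactive row found: invalid
    return True            # no radioactive row anywhere: valid

# Modus ponens (valid): A -> B, A, therefore B
print(is_valid([lambda a, b: (not a) or b, lambda a, b: a],
               lambda a, b: b, 2))   # True

# Affirming the consequent (invalid): A -> B, B, therefore A
print(is_valid([lambda a, b: (not a) or b, lambda a, b: b],
               lambda a, b: a, 2))   # False
```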

The Reverse Truth Table Method

What happens if we symbolize or translate an argument and we end up with 5, 6, or more atomic sentence letters? Are we doomed to create a truth table with 32, 64, or more rows? That would be a fate worse than many things!

Fear not, dear student. We have a method for directly testing the validity of an inference without having to build a complete truth table. It’s called the indirect or reverse truth table method. The basic idea is to assign to the premises and conclusion the truth values of a counterexample row (all premises true, conclusion false), figure out what the atomic sentence letters would need to be in order for those truth values to obtain, and then see whether we can consistently assign truth values to the atomic sentence letters. So, basically, we’re looking for that “radioactive” row (TF, TTF, TTTF, etc.) from the complete truth table method of the previous section, but instead of building a whole truth table, we go straight to the end and test whether such a row could exist. We’re testing to see if a radioactive row is even possible.

So, let’s walk through an example, and then we’ll come up with a set of steps for doing the reverse truth table method. Here’s the English argument:

If you don’t pass the driver’s license written test, then you won’t have a driver's license (until you are able to pass it). But I don’t have a driver’s license, so that means I didn’t pass the driver’s license written test.

Consider how confusing this would be to someone who has passed the written portion of the test, but hasn’t yet completed the driving test. They did pass the written test, but still don’t have a driver’s license! This is confusing, because this argument is what is called “Affirming the Consequent”. It’s a formal fallacy, or an invalid argument that might appear to be valid. Let’s symbolize it (ignoring the parenthetical):

~W \(\rightarrow\) ~L

~L

\(\therefore\) ~W

Remember that “\(\therefore\)” means “therefore”.

In order to run the reverse truth table method of analysis on this argument, we’ll first want to set up as if we are doing the output side of a truth table. We’ll want to give ourselves lots of room to work with:

(~W \(\rightarrow\) ~L) / ~L // ~W

Now, the next thing to do is to assign truth values to each whole proposition so that we have a radioactive row or counterexample row. In this case, there are two premises and a conclusion, so the counterexample row will be TTF:

(~W \(\underset{\bf{T}}{\rightarrow}\) ~L) / ~\(\underset{\bf{T}}{L}\) // \(\underset{\bf{F}}{\sim W}\)

The setup part is done. Now we need to do the hard work: actually work out if we can consistently apply truth values to the atomic sentence letters. We’ll go one step at a time, starting with the conclusion. Why start with the conclusion? It's typically easier to make a sentence false given the truth tables for disjunction, negation, and conditional; so the conclusion generally is the easier place to start. Can you figure out why the rules for these operators make establishing falsehood easier?

(~W \(\underset{\text{T}}{\rightarrow}\) ~L) / ~\(\underset{\text{T}}{L}\) // \(\underset{\text{F}}{\sim \overset{\bf{T}}{W}}\)

The conclusion is false, so W will need to be true since the conclusion is ~W. So then we assign T to every W that appears in the argument:

(~\(\overset{\bf{T}}{W}\) \(\underset{\text{T}}{\rightarrow}\) ~L) / ~\(\underset{\text{T}}{L}\) // \(\underset{\text{F}}{\overset{\bf{F}}{\sim }\overset{\bf{T}}{W}}\)

And then work out how that affects the formulas containing the atomic letters I just assigned truth values to. In this case, if W is true, then ~W is false, and if an antecedent is false, then the whole conditional is true. The main operator is the conditional, and so no problems with the first premise:

(\(\overset{\bf{F}}{\sim }\overset{\bf{T}}{W}\overset{\bf{T}}{\underset{\text{T}}{\rightarrow}}\) ~L) / ~\(\underset{\text{T}}{L}\) // \(\underset{\text{F}}{\overset{\bf{F}}{\sim }\overset{\bf{T}}{W}}\)

What about the second premise? Well, since we haven’t yet been forced to assign a truth value to L, we can assign whatever we want to it, so we’ll make it false! The result will be that ~L is true.

(\(\overset{\bf{F}}{\sim }\overset{\bf{T}}{W}\overset{\bf{T}}{\underset{\text{T}}{\rightarrow}}\) ~L) / \(\overset{\bf{T}}{\sim }\underset{\text{T}}{\overset{\bf{F}}{L}}\) // \(\underset{\text{F}}{\overset{\bf{F}}{\sim }\overset{\bf{T}}{W}}\)

At this point, we’ve symbolized the argument, put it in a row as if we were going to make a truth table out of it, and then tried assigning truth values to the atomic sentence letters to make a radioactive row (in this case TTF). Since we were able to do so without coming across a contradiction, we know that the inference is invalid .

Here’s a sort of algorithm for the reverse truth table method of analysis:

1. Symbolize the inference and write out in a single row using slashes between formulas.

2. Assign truth values to each complete formula such that all premises are true and the conclusion is false.

3. Then calculate the truth value of each atomic sentence letter given the truth value of the whole formula. Start with the most restrictive formulas. If you run into a case where multiple truth values would work, then start a new line for each possibility and test each going forward.

4. Then transfer the atomic sentence letter truth value(s) to the other instances throughout the whole inference (transfer the truth values of, for example, ‘A’ to all other A’s throughout the formula).

5. Then calculate whether it is possible to continue assigning truth values to atomic sentence letters and transferring those values to other instances without running into a contradiction (that is, where a single letter must be both T and F) .

6. If you run into no contradiction , then the inference is invalid because the radioactive row is possible . The process is over.

7. If you run into a contradiction , then the inference may be valid. Complete all lines you’ve started to ensure that there is no possible consistent assignment of truth values. You only need one contradiction-free row to prove that the inference is invalid, whereas you need to eliminate every possible row that could have true premises and a false conclusion before you can know it’s valid.
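At bottom, this algorithm is a search for a consistent counterexample assignment, and that search can be mechanized. A rough Python rendering (the function name `find_counterexample` and the truth-function encoding are mine, not the text’s notation), applied to the driver’s-license argument from earlier:

```python
from itertools import product

def find_counterexample(letters, premises, conclusion):
    """Search for a consistent assignment making every premise true and the
    conclusion false; returning None means the inference is valid."""
    for values in product([True, False], repeat=len(letters)):
        row = dict(zip(letters, values))
        if all(p(row) for p in premises) and not conclusion(row):
            return row
    return None

# The driver's-license argument: ~W -> ~L / ~L // ~W
cx = find_counterexample(
    ["W", "L"],
    [lambda v: v["W"] or (not v["L"]),   # ~W -> ~L, i.e. W v ~L
     lambda v: not v["L"]],              # ~L
    lambda v: not v["W"],                # ~W
)
print(cx)   # {'W': True, 'L': False}
```

The assignment it finds (W true, L false) is exactly the contradiction-free radioactive row the hand derivation produced, so the inference is invalid.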

Okay, let’s try a more complex one:

Lila: We’re either going home or I’m going home alone unless you both assure me you will drive us home later and will phone the babysitter.

Diego: I can’t assure you that I’ll be sober enough to drive us home later.

Lila: Well, I’m not going home alone and you’re not staying the night here.

Diego: Well, then, I guess either we’re both going home now or we’re getting a motel room.

If we conceive of the whole exchange as one big inference, we can symbolize it the following way:

(H \(\vee\) (I \(\vee\) (A \(\wedge\) P))) / ~A / (~I \(\wedge\) ~S) // (H \(\vee\) M)

Now, we assign truth values to the premises and conclusion so that they’ll be radioactive.

Here, I’ve gone ahead and put them under the main operators:

(H \(\underset{\bf{T}}{\vee}\) (I \(\vee\) (A \(\wedge\) P))) / \(\underset{\bf{T}}{\sim }\)A / (~I \(\underset{\bf{T}}{\wedge}\) ~S) // (H \(\underset{\bf{F}}{\vee}\) M)

Next, I’m going to start looking for some simpler formulas to assign truth values to. I’m eyeing the conclusion and the ~A. The conclusion will only admit of one set of truth values: both H and M must be false for the \(\vee\) to be false. A must be false for ~A to be true.

(H \(\underset{\text{T}}{\vee}\) (I \(\vee\) (A \(\wedge\) P))) / \(\underset{\bf{T}}{\sim }\underset{\text{F}}{A}\) / (~I \(\underset{\text{T}}{\wedge}\) ~S) // (\(\underset{\bf{F}}{H}\underset{\text{F}}{\vee} \underset{\bf{F}}{M}\))

Next, you transfer truth values from the ones you just assigned to all identical letters.

[Figure 6.5.3: the argument with the truth values of H, M, and A transferred to every occurrence of those letters]

Notice how I’d need the right disjunct (I \(\vee\) (A \(\wedge\) P)) to be true for the first premise to turn out true.

But we already know that A is false from premise 2. So that means the conjunction (A \(\wedge\) P) must be false. Now that we know this, we must conclude that I is true for the first premise to turn out true. So we can conclude that I is true.

(\(\underset{\text{F}}{H} \underset{\text{T}}{\vee} (\underset{\bf{T}}{I} \vee (\underset{\text{F}}{A} \underset{\bf{F}}{\wedge}\) P))) / \(\underset{\text{T}}{\sim }\underset{\text{F}}{A}\) / (~I \(\underset{\text{T}}{\wedge}\) ~S) // (\(\underset{\text{F}}{H}\underset{\text{F}}{\vee} \underset{\text{F}}{M}\))

Now we transfer the new atomic truth value:

[Figure 6.5.4: the argument with the truth value of I transferred to its occurrence in the third premise]

Now let’s try to work out that third premise. First we can process the negation on I:

(\(\underset{\text{F}}{H} \underset{\text{T}}{\vee} (\underset{\text{T}}{I} \vee (\underset{\text{F}}{A} \underset{\text{F}}{\wedge}\) P))) / \(\underset{\text{T}}{\sim }\underset{\text{F}}{A}\) / (\(\underset{\bf{F}}{\sim }\underset{\text{T}}{I} \underset{\text{T}}{\wedge}\) ~S) // (\(\underset{\text{F}}{H}\underset{\text{F}}{\vee} \underset{\text{F}}{M}\))

We’ve already got a contradiction! Ouch! ~I would have had to be true for (~I \(\wedge\) ~S) to turn out true, since both conjuncts of a true conjunction must be true. But ~I is false, because we were already forced to assign true to I. Bummer dude!

What now? We’ve reached a contradiction! At this point we ask ourselves: “was there any step I made that I wasn’t forced to make?” In this case, no, we didn’t arbitrarily choose true or false for any letter, so every step we took was forced by logic. That means there aren’t any alternative assignments to consider and therefore the radioactive counterexample is impossible . Our inference is valid .
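As a sanity check on the derivation, the verdict can be confirmed by brute force. A sketch in Python (my own encoding, with sentences as truth functions over a dictionary of letter values; not part of the method itself):

```python
from itertools import product

letters = ["H", "I", "A", "P", "S", "M"]
premises = [
    lambda v: v["H"] or (v["I"] or (v["A"] and v["P"])),   # H v (I v (A & P))
    lambda v: not v["A"],                                  # ~A
    lambda v: (not v["I"]) and (not v["S"]),               # ~I & ~S
]
conclusion = lambda v: v["H"] or v["M"]                    # H v M

# Scan all 2**6 assignments for a row with true premises and a false conclusion.
counterexample_found = any(
    all(p(dict(zip(letters, row))) for p in premises)
    and not conclusion(dict(zip(letters, row)))
    for row in product([True, False], repeat=len(letters))
)
print(counterexample_found)   # False: no radioactive row, so the inference is valid
```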

You could have also assigned values to I and S given the third premise, but I chose to finish the first premise instead. Either way would’ve resulted in a contradiction. This is not an example of an alternative assignment ; we’ll cover alternative assignments below.

Here's a handy flow chart for you:

[Figure 6.5.2: flow chart for the reverse truth table method]

Let’s try a quick example with alternative assignments possible:

(H \(\underset{\text{T}}{\vee}\) (I \(\vee\) (A \(\wedge\) P))) / \(\underset{\text{T}}{\sim }\)A / (H \(\underset{\text{T}}{\vee}\) M) // (~I \(\underset{\text{F}}{\wedge}\) ~S)

There are many ways for Premise 1 to be true, many ways for Premise 3 to be true, and many ways for the Conclusion to be false (an exception to the generalization I made earlier). So we’re not being forced as much as we were in the previous example. Arguments where the conclusion has more than one way of being false are typically the hardest arguments to do using the reverse truth table method. Hardest, that is, in terms of how much work is involved. Remember that this isn’t difficult in the sense of being a complicated procedure. It’s not like playing chess; we could program a computer to do this whole method in an afternoon. It does, though, sometimes take a bit of work to reach the answer. Let’s start with what we are forced to do:

(H \(\underset{\text{T}}{\vee}\) (I \(\vee\) (\(\underset{\bf{F}}{A}\) \(\wedge\) P))) / \(\underset{\bf{T}}{\sim }\underset{\text{F}}{A}\) / (H \(\underset{\text{T}}{\vee}\) M) // (~I \(\underset{\text{F}}{\wedge}\) ~S)

A must be false because ~A is true. At this point, we aren’t forced to do anything for Premise 1 since there are still many ways for it to come out true. Nothing has changed for Premise 3 and the Conclusion. At this point we need to split our line into all possible successful assignments. I’m going to start with the conclusion (I chose this more or less arbitrarily). Here’s what I do:

\[\begin{array}{} &(H &\vee (I \vee (&A& \wedge P))) / &\sim &A / (H &\vee M) // (&\sim I& \wedge& \sim S) &&\\ \text{Possibility 1} & \rightarrow &T &F &F &T &F &T &\textbf{T} &F &\textbf{F}&&\\\text{Possibility 2} & \rightarrow &T &F &F &T &F &T &\textbf{F} &F &\textbf{T}&&\\\text{Possibility 3} & \rightarrow &T &F &F &T &F &T &\textbf{F} &F &\textbf{F} && \end{array} \nonumber\]

If you focus on the conclusion, it looks sort of like a truth table now, doesn’t it? Now we complete the process for each possible line:

\[\begin{array}{} (H &\vee (I \vee (&A& \wedge P))) / &\sim &A / (H &\vee M) // (&\sim &I& \wedge& \sim &S) \\& T &F &F &T &F &T &T&\textbf{F} &F &F&\textbf{T}\\ &T &F &F &T &F &T &F&\textbf{T} &F &T&\textbf{F}\\ &T &F &F &T &F &T &F&\textbf{T} &F &F&\textbf{T} \end{array} \nonumber\]

Now I’m transferring truth values:

[Figure 6.5.5: the three possibility rows with the newly assigned truth values transferred to all identical letters]

And then calculate what needs to happen given the changes you’ve made. I’ve changed the first premise, so now I notice that (A \(\wedge\) P) is false and now I is false in my first row, so that means H must be true. In the other rows, nothing has yet forced me to assign anything to H.

\[\begin{array}{} (H &\vee (&I \vee (&A& \wedge P))) / &\sim &A / (H &\vee M) // (&\sim &I& \wedge& \sim &S) \\T& T &F &F &F &T &F &T&T &F &F&F &T\\ &T &T &F &F &T &F &F&F &T &F&T &F\\ &T &T &F &F &T &F &F&F &T &F&F &T \end{array} \nonumber\]

Then transfer that H:

[Figure 6.5.6: row 1 with the truth value of H transferred to its other occurrences]

At this point, nothing is forcing my hand. If you look carefully, you’ll see that there is no single letter on any single line that must take a particular truth value in order for the truth values we’ve assigned to the complete formulas to obtain. So now what?

If we are free to assign the remaining truth values however we like, then we aren’t going to run into any contradictions. So each of these three rows is contradiction-free, and therefore there is at least one row with no contradiction. This inference is invalid .

All we need is one row where there are no contradictions to prove that the inference is invalid.
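The same brute-force check works here too. A sketch in Python (my own encoding, with sentences as truth functions over a dictionary of letter values) that stops at the first contradiction-free counterexample row:

```python
from itertools import product

letters = ["H", "I", "A", "P", "S", "M"]
premises = [
    lambda v: v["H"] or (v["I"] or (v["A"] and v["P"])),   # H v (I v (A & P))
    lambda v: not v["A"],                                  # ~A
    lambda v: v["H"] or v["M"],                            # H v M
]
conclusion = lambda v: (not v["I"]) and (not v["S"])       # ~I & ~S

# One assignment with all-true premises and a false conclusion is enough
# to prove invalidity, so stop at the first one found.
for row in product([True, False], repeat=len(letters)):
    v = dict(zip(letters, row))
    if all(p(v) for p in premises) and not conclusion(v):
        print("counterexample:", v)
        break
```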

That’s all! That’s how we do the reverse truth table method. One can imagine how to use this to test even super complex sets of sentences for consistency: just assign “true” to each individual sentence and then look for a contradiction. One line with no contradiction? You’ve got a consistent set of sentences.


  • The Macat Team
  • Dec 3, 2019

10 examples of critical thinking that changed the world

It’s fair to say that Einstein was using critical thinking skills during the 10 years that it took him to create his Theory of General Relativity. Other physicists assumed that the differences in the ways that bodies fall were too small to be of significance, but Einstein—a 28-year-old clerk at a patent office—could see that these details deserved further investigation.

He had to come up with another, more creative, solution.

“Suddenly a thought struck me,” he recalled. “If a man falls freely, he would not feel his weight… This simple thought experiment… led me to the theory of gravity.”

From this he developed the theory of gravity that governs how every sun, planet, and object in our universe behaves, and predicted the existence of gravitational waves: ripples in spacetime produced by accelerating masses.

In 2016, the LIGO collaboration proved him right: they announced their first direct detection of gravitational waves in “the scientific breakthrough of the century.” Professor Stephen Hawking said the discovery had “the potential to revolutionize astronomy.”

“Being bold enough to let your mind go where good arguments take you, even if it’s to places that make you feel uncomfortable, may lead you to discoveries about the world and yourself.” (Critical Thinking: The Art of Argument, by George W. Rainbolt and Sandra L. Dwyer)

Einstein’s Theory of General Relativity places him among the most influential nonconformists, mavericks, and free-thinkers in history. Charles Darwin might also spring to mind. Maybe Galileo, Marie Curie, or Simone de Beauvoir.

We know them as geniuses, eccentrics, independent spirits, or even rebels. But what they all have in common is the ability to think creatively and critically about the world, putting aside their peers’ ignorance or assumptions to see new connections in the most mundane situations and change our view of the universe. They are critical thinkers.

1. Albert Einstein

C.P. Snow put it best: “One of [Einstein’s] greatest intellectual gifts, in small matters as well as great, was to strip off the irrelevant frills from a problem.”

(From Einstein: The First Hundred Years )

If you take one critical thinking tip from Einstein, make it…

If something looks wrong, then it’s probably worth finding out why. Trust your own judgement based on the facts, not the assumptions of others, and look for a solution within the details.

2. Charles Darwin

Darwin’s ability to see new connections in mundane situations led him to map out a new theory—evolution—that changed the way we saw the world.

If you take one critical thinking tip from Darwin, make it…

Sometimes the most profound discoveries are hidden in seemingly unlikely places; look where others don’t, and enjoy the sense of discovery and excitement.

3. Galileo Galilei

Pioneering astronomer, philosopher, and, after his discoveries caused uproar among lazy thinkers in religious circles, “defender of truth in the face of ignorance”.

If you take one critical thinking tip from Galileo, make it…

Great critical thinkers evaluate arguments to see how they stand up, putting to one side the conclusions and assumptions of others—and filter for themselves what resonates as right or wrong.

4. Martin Luther King, Jr.

Inspired millions with his talent for argument; his “I have a dream” speech—a rallying cry for equal rights—still resonates 50 years on.

If you take one critical thinking tip from Martin Luther King Jr, make it…

Developing a strategy, organizing an argument, and learning the art of persuasion are the keys to changing the world.

5. Simone de Beauvoir

The most radical feminist thinker of the 20th century;  The Second Sex  was the first work to argue for equality that respected a woman’s individuality and voice.

If you take one critical thinking tip from Simone de Beauvoir, make it…

Don’t be afraid to think differently, even if that means challenging the basis of society itself.

6. Edwin Hubble

Discovered galaxies beyond the Milky Way and showed that the universe is expanding, simply by gathering and analyzing more data than anyone else.

If you take one critical thinking tip from Edwin Hubble, make it…

Evidence, evidence, evidence. The more you have, and the more you can filter it to get to what’s really going on, the better your conclusion will be.

7. Marie Curie

Paved the way for x-rays and cancer treatment; her sense that pitchblende must include unknown radioactive elements led to the discovery of polonium and radium.

If you take one critical thinking tip from Marie Curie, make it…

Critical thinking is nothing to do with negativity or nitpicking. It’s about asking questions—the right questions. It’s about not accepting things on trust.

8. Sir Isaac Newton

Discovered universal gravitation “by thinking on it continually.” A genius known for a relentless passion for putting everything to rigorous test.

If you take one critical thinking tip from Sir Isaac Newton, make it…

Persistence in thinking and questioning the world around you is the key to more creative solutions where others see only masses of information.

9. Stanislav Petrov

Saved the world from a nuclear disaster during the Cold War; Petrov spotted a false computer report of an American missile strike and, trusting the facts at hand, halted a mistaken counter strike.

If you take one critical thinking tip from Stanislav Petrov, make it…

Form your own judgement based on the facts, and—once you’re sure of your ground—be willing to back it against all comers.

10. W. E. B. Du Bois

Inspired American civil rights movements by refusing to accept that some inequality could be exchanged for legal rights—a view held by other black intellectuals—and publishing his ideas in The Souls of Black Folk .

If you take one critical thinking tip from W. E. B. Du Bois, make it…

Critical thinking is important because it is what makes us adaptable, enables us to act independently, and allows us to move beyond what we already know or guess.

Can you suggest any other great critical thinkers or examples of great critical thinking? Let us know in the comments section below.



What Is Critical Thinking? | Definition & Examples

Published on 25 September 2022 by Eoghan Ryan.

Critical thinking is the ability to effectively analyse information and form a judgement.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources .

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria


Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions

Critical thinking is important for making judgements about sources of information and forming your own arguments. It emphasises a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process . The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In an academic context, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper . It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyse the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context

You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words ‘sponsored content’ appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context

You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test , these questions focus on the currency , relevance , authority , accuracy , and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarise it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it a blog? A newspaper article?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion ? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test , focusing on the currency , relevance , authority , accuracy , and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test  and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the ‘Cite this Scribbr article’ button to automatically add the citation to our free Reference Generator.

Ryan, E. (2022, September 25). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved 15 April 2024, from https://www.scribbr.co.uk/working-sources/critical-thinking-meaning/



CBE Life Sci Educ, vol. 17, no. 1 (Spring 2018)

Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

Jason e. dowd.

† Department of Biology, Duke University, Durham, NC 27708

Robert J. Thompson, Jr.

‡ Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Leslie A. Schiff

§ Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Julie A. Reynolds

Associated Data

This study empirically examines the relationship between students’ critical-thinking skills and scientific reasoning as reflected in undergraduate thesis writing in biology. Writing offers a unique window into studying this relationship, and the findings raise potential implications for instruction.

Developing critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference , while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference ) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but distinct constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. However, few coherent frameworks and little empirical evidence exist regarding either the general or the domain-specific interrelationships between scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunity both to foster and to examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

Advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990 , p. 3). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced on this consensus “expert” view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and on the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

TABLE 1. Thesis assessment protocol dimensions

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though a thesis is required to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two did not consent to participate in the study; additionally, five did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid and invalid CCTST scores. Therefore, throughout most of this study, we focus on the 65 students who consented to participate and for whom we have complete and valid data. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on those 52 students.
We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and the claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
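The published score bands lend themselves to a small helper for converting scale scores into qualitative labels. This is an illustrative sketch: the band boundaries come from the text above, while the function name and structure are our own.

```python
def cctst_band(score: float) -> str:
    """Map a CCTST scale score (0-100) to its published performance band."""
    if not 0 <= score <= 100:
        raise ValueError("CCTST scale scores range from 0 to 100")
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"
```

For example, an overall score of 84 falls in the “strong” band, while a score of 62 falls below the threshold at which the skill is considered manifested.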

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make assignment to a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
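The within-one-point agreement statistic reported above is simply the fraction of paired dimension ratings on which the two independent raters differed by at most 1 point. A minimal sketch (the ratings below are hypothetical, not study data):

```python
def within_one_point_agreement(ratings_a, ratings_b):
    """Fraction of paired independent ratings differing by no more than 1 point."""
    if len(ratings_a) != len(ratings_b):
        raise ValueError("Both raters must score the same set of dimensions")
    close = sum(1 for a, b in zip(ratings_a, ratings_b) if abs(a - b) <= 1)
    return close / len(ratings_a)

# Hypothetical ratings from two raters on four dimensions (1-5 scale)
rater_1 = [5, 4, 3, 5]
rater_2 = [4, 4, 1, 5]
agreement = within_one_point_agreement(rater_1, rater_2)  # 3 of 4 pairs agree
```

Note that this captures only pre-discussion agreement; the consensus score itself emerges from discussion rather than from any formula.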

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
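Under this scoring scheme, the two cumulative BioTAP measures reduce to a plain sum and a weighted sum of the consensus ratings. A sketch follows; the loadings shown in the usage comment would be placeholder values, not the loadings published by Dowd et al. (2016):

```python
def biotap_partial_sum(ratings):
    """Partial sum over BioTAP questions 1-5 (higher-order science reasoning)."""
    return sum(ratings[:5])

def biotap_factor_score(ratings, loadings):
    """Weighted combination of all nine dimension ratings using factor loadings."""
    return sum(r * w for r, w in zip(ratings, loadings))
```

For a thesis rated [5, 4, 3, 4, 5, 3, 4, 5, 4] on questions 1–9, the partial sum is 21; the factor score depends on the loadings supplied.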

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
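For reference, the correlation statistic used throughout these analyses can be computed from scratch. This sketch implements Pearson's product-moment r (significance testing aside) using only the standard library:

```python
import math

def pearson_r(x, y):
    """Pearson's product-moment correlation between two paired samples."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)
```

In practice a statistics package would also return the associated p value; the point here is only that r measures the strength of the linear relationship between paired scores.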

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
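The Holm-Bonferroni method is a step-down procedure: p values are tested in ascending order against successively relaxed thresholds, and testing stops at the first failure. A minimal sketch (note that for the smallest of 35 p values the threshold is 0.05/35 ≈ 0.00143, matching the cutoffs quoted with the results):

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Step-down Holm-Bonferroni correction for multiple comparisons.

    Returns a list of booleans, parallel to p_values, that is True where the
    null hypothesis is rejected. The k-th smallest p value (k = 0, 1, ...)
    is compared against alpha / (m - k); testing stops at the first failure.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break
    return reject
```

Because the thresholds tighten as the number of comparisons grows, the 35-comparison analyses in this study demand much smaller p values than the 5- or 7-comparison analyses.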

RESULTS

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

TABLE 2. Descriptive statistics of CCTST dimensions a

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

TABLE 3. Correlations between dimensions of CCTST and dimensions of BioTAP a

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.
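The effect sizes reported here are Cohen's d. Since the text does not spell out the pooling formula used, this sketch shows the conventional pooled-standard-deviation definition for two independent groups:

```python
import math

def cohens_d(group_1, group_2):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    n1, n2 = len(group_1), len(group_2)
    mean_1 = sum(group_1) / n1
    mean_2 = sum(group_2) / n2
    # Sample variances (Bessel-corrected), then the pooled standard deviation
    var_1 = sum((x - mean_1) ** 2 for x in group_1) / (n1 - 1)
    var_2 = sum((x - mean_2) ** 2 for x in group_2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * var_1 + (n2 - 1) * var_2) / (n1 + n2 - 2))
    return (mean_1 - mean_2) / pooled_sd
```

By the common rule of thumb that d values near 0.2, 0.5, and 0.8 indicate small, medium, and large effects, the differences of 0.71–0.88 reported above are medium to large.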

The t statistics and effect sizes of differences in dimensions of CCTST across dimensions of BioTAP a

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size (Cohen’s d). Differences that are statistically significant after correcting for multiple comparisons are shown in bold.

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.
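The structure of the combined regression model (model 3) can be sketched as follows. This is not the study's analysis: the data, the single covariate (a hypothetical GPA), and the coefficients are all invented for illustration; the point is only the shape of a regression of a composite writing score on inference plus background covariates.

```python
# A toy version of a multiple regression like Table 5's model 3:
# composite BioTAP score ~ intercept + inference + covariate.
# Data are simulated; coefficients do not reproduce the study.
import numpy as np

rng = np.random.default_rng(0)
n = 52                                    # matches the analytic sample size
inference = rng.normal(20, 3, n)          # hypothetical CCTST inference scores
gpa = rng.normal(3.4, 0.3, n)             # hypothetical background covariate
biotap = 1.5 + 0.4 * inference + 0.2 * gpa + rng.normal(0, 1, n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), inference, gpa])
coef, *_ = np.linalg.lstsq(X, biotap, rcond=None)
print(coef)  # [intercept, inference slope, covariate slope]
```

In such a model, a persistent inference slope after covariates are added is what the text describes: the covariates explain little additional variation in the outcome.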

Partial sum (questions 1–5) of BioTAP scores ( n = 52)

** p < 0.01.

*** p < 0.001.

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing , captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-­thinking skills as the primary learning outcome of writing-­focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing . Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and results of other studies (with different populations) suggest that observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that further study of these relationships could benefit all students.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

Supplementary Material

Acknowledgments.

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer C. H., Taylor E., Gillmore G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, (1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner N. H., Frazier C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, (1), 47–64.
  • Clase K. L., Gundlach E., Pelaez N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, (5), 290–295.
  • Condon W., Kelly-Riley D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, (1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding L., Wei X., Liu X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, (5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd J. E., Connolly M. P., Thompson R. J., Jr., Reynolds J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, (1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd J. E., Roy C. P., Thompson R. J., Jr., Reynolds J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, (1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd J. E., Thompson R. J., Jr., Reynolds J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 36–51.
  • Facione P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman R. D., Russell A. A., Worden K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, (5), 46–52.
  • Greenhoot A. F., Semb G., Colombo J., Schreiber T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, (2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, (1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen J. S., Bosack T., Clay S., McCarthy M., Dunn D. S., Hill G. W., Whitlock K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, (3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand B., Keys C. W. (1999). Inquiry investigation. Science Teacher, (4), 27–29.
  • Holm S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, (2), 65–70.
  • Holyoak K. J., Morrison R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly G. J., Takao A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, (3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn D., Dean D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, (2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn D., Iordanou K., Pease M., Wirkala C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, (2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish D., LaVaque-Manty D., Silver N., Kaplan M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri B., David B.-C., Uri Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, (1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo I. J., Faiola C. L., Johnson J. E., Kurtz M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, (3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo I. J., Kurtz M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, (2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds J. A., Smith R., Moskovitz C., Sayle A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, (10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds J. A., Thaiss C., Katkin W., Thompson R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds J. A., Thompson R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, (2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla M., Brosseau-Liard P. E., Savalei V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, (3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson N. S., Sadler-McKnight N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, (1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq V. N., Stefani L. A. J., Butcher A. C., Heylings D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, (3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman B. E. C., Strickland D. C., Johnson R. L., Payne J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, (5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping K. J., Smith E. F., Swanson I., Elliot A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, (2), 149–169. https://doi.org/10.1080/713611428
  • Willison J., O’Regan K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, (4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin T., Carter V. C., Fletcher L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, (2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin A., Abd-El-Khalick F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, (9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, (2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001

  Spring 1001 •   ENGLISH 110J • Professor Tanaka

  CT MODEL 1

 <CAUTION: These handouts must be used in conjunction with class discussion and other course materials. They are not intended to be stand-alone explanations.>

INTRODUCTION

    I’d like to briefly describe the set of models and concepts that you will be using as ‘tools’ to help you ask meaningful questions about traditional grammar and to understand what a good explanation of a grammatical problem might look like. Of course, this approach is not very ‘traditional.’ If you have read through the survey of grammatical analysis in your handout, you will have seen that normative or prescriptive grammars have traditionally tried to make rules about correct or incorrect usage. They have not been primarily designed as explanatory or even descriptive grammars.

    Our goal is not to generate expert, linguistic explanations of grammar but to develop a reasonable problem-solving and questioning attitude towards what ‘traditional grammars’ have to say about language. I put the term in quotation marks because almost all current popular guides to English grammar, as well as many college texts, make many of the normative assumptions that grammarians have made for hundreds of years. Hence, we don’t have to go to an ancient, out-of-print text to find a good example of the traditional approach to English grammar.

    So in this handout, I would like simply to give you the basic outline of our critical thinking <CT> model. I will leave the application of this model to our two textbooks for another time.

A. ARGUMENTS AND REASONS

1) An argument is an attempt to prove the truth or rationality of your beliefs or ideas to a specific audience. It is supposed to persuade your audience to accept or understand your point of view.

2) All arguments are directed towards a specific audience. You cannot 'argue' in a vacuum. There would be no point to it. It follows that to argue well you must know your audience. And knowing your audience means knowing what beliefs you have in common <these beliefs you need not prove> and what you do not <what you need to prove>.

3) In order to argue well, you need to know what you yourself believe. You must be self-aware. An argument is an attempt to get others to either share or accept your right to hold your beliefs. A good argument is always based on a set of assumptions that you and your audience have in common. In the context of constructing effective explanations, if you do not understand what you and your audience do and do not share, your arguments will most likely fail.

4) An ancient philosopher said, ‘Know yourself and know your enemy and in a thousand battles win a thousand victories.’ This principle holds true in communications as well as warfare.

  

B. THE UNCERTAINTY OF TRUTH

1) In a public university environment, we are all supposed to share the same rules for critical thinking and argumentation. In other words, we should all use the same rules for giving reasons and explanations. And we should all share the same rules for deciding what counts as a good or reasonable argument and what does not. It goes without saying that teachers and students should ideally have a common notion of what, for example, it means to call a statement ‘true’ or a ‘fact.’ To help establish this common model, critical thinking as a subject has been mandated at all levels of instruction, from kindergarten to our own GE requirements here at CSUS.

2) There are a lot of assumptions underlying this focus on critical thinking. But one that is central to our course of study is this: what people refer to as ‘the truth’ is something that emerges from open discussion and open debate. The reason for this openness in the search for truth is the basic belief that there is no one ‘true-for-all-time’ description of reality. We all live in a world that is uncertain and contingent, not so much because ‘reality’ itself is changing. Rather, our concepts for understanding and interpreting that reality are, for a complex of reasons, in a state of continual flux. Now contrary to what some may want to believe, no one person, group or ideology has any special insight into an ‘always correct’ understanding of a world that involves the participation of others who hold different belief systems.

3) So all ‘truths’ are in a sense contingent upon the arguments that support them. For an idea or belief to be held as ‘true’ or at least ‘reasonable,’ it must meet the critical thinking criteria shared by members of a particular group. Obviously, for those who do not share those assumptions, that idea or belief may be interpreted differently. In the remainder of this discussion, I’ll be assuming that we all, as faculty and students, agree that there is a common model that we share. This common model is not actually stated in any one specific place, but if you pick up any university textbook or listen to most university lectures, you will find a very similar set of underlying assumptions.

4) For example, most texts and instructors would assume that before we agree that a statement, P, about a particular subject, X, is true or reasonable, we would be obligated to listen to or consider the primary arguments for and against P before making up our minds. We can’t close out options and possibilities simply because they do not agree with our own personal beliefs and opinions, especially if we admit to knowing next to nothing about X.

5) The point I have just made may seem self-evident.   But in the real world, many of us seem to be ‘naturally inclined’ to accept the truth of what we already believe and ignore other interpretations, even when we can’t offer any logically valid explanations to support our own beliefs.

6) One of the primary arguments for why a university education is necessary for many of us is that the discipline of the university environment forces us to take account of who we are, what we believe and why.   Here, we are all expected to live and work in a world where reasons and explanations are expected, if not demanded, of everyone, students and teachers alike.

C. AN OUTLINE OF AN ARGUMENT

 1) Even if you have not taken a course specifically devoted to critical thinking in the past, you have all learned how to construct an argument in your writing courses. So I am going to review the basic structure of an argument by reviewing the fundamentals of writing an argumentative essay.

 2) Regardless of what textbook you used in English 1A or its equivalent, you were all probably taught to structure an essay in something like the following fashion.

     a. you have a thesis statement. It is a statement that can be proven to be true or false, reasonable or unreasonable for a specific audience. The intended result is that, after reading your essay, this audience will agree that your thesis is true or that it is reasonable. Your teacher probably told you to always be aware of your audience when constructing your arguments. That is excellent advice, since the selection of your audience is the most critical single choice a writer makes.

     b. your paper consists of a series of supporting arguments, each of which is supported by specific examples and other evidence. The relationship between your thesis and your supporting arguments should be such that if your audience accepts your supporting arguments, they must also accept your thesis.

 3) So constructing an argument is rather simple if you have the right tools and you are clear in your presentation. Other things being equal, there are only two keys to developing a successful argument. First, your argument itself must follow the rules of logic. You must make consistent statements and valid inferences. Second, you must have sufficient evidence to persuade your audience to accept your thesis statement.

 4) Now arguments usually occur when the truth of a statement, P, about a subject, X, is uncertain or unclear. And usually, an argument is judged in comparison to other arguments. For example, you may be arguing that X is P while your colleague is arguing that X is not-P.

 So if you and the other person(s) are both using the rules of logic properly, then the better argument is usually determined by one thing: the evidence. Other things being equal, the best argument is the one that provides the best evidence or support for its thesis.

  D. THE RULES OF LOGIC

1) OK, let’s talk about the rules of logic and critical thinking here. The most important concept that you will use in most of your university courses is the principle of consistency. Very simply, it says that a good argument does not contain contradictions, where a contradiction is defined as the assertion that a specific statement is both true and not-true.

2) Consistency, or the lack of it, can be seen in a number of different ways. For example, you may be inconsistent in your use of definitions or in the ideas that your argument presupposes. You might also be inconsistent through negligence: you fail to present key evidence or ignore issues that are logically necessary to your case.

3) Even though I am taking great liberties with traditional logic here, I want to define two basic forms of consistency, what I will call INTERNAL and EXTERNAL consistency.

4) Other things being equal, an argument is internally consistent if it does not contradict itself. In other words, if, in the context of an argument, you claim that ‘X IS Y’ and maintain that position throughout your argument, then, in respect to that statement, you are being consistent. <Again, in critical thinking classes, what we are talking about would be called logical validity.>

     a. However, if, in another place in your argument, you state that it is not the case that ‘X IS Y’ <or ‘X IS not-Y’ or some variation on that theme>, then your argument is INTERNALLY INCONSISTENT. This is not good.

     b. For example, given the criteria as they are now defined, you can’t give Cate Blanchett an Academy Award for Best Actress and Best Supporting Actress for the same role in the same movie.

     c. The point about internal consistency has nothing to do with facts about the world but with the logical shape or structure of your argument. If an argument is inconsistent, then it cannot lead to an acceptable result, regardless of what may or may not be true of the real world.

     d. So the first order of business when checking an argument, whether it is your own or someone else's, is to look for internal inconsistencies. If it contains internal inconsistencies, then the argument is dead.

     e. Related to the issue of internal consistency are the basic RULES OF INFERENCE. These rules limit what conclusions can be drawn from what kinds of evidence. For example, the familiar syllogism offers what is probably the most common inference schema.

          Premise:      All X’s are Y.
          Fact:         P is an X.
          Conclusion:   Therefore, P is a Y.

 If the premise is true and the fact-statement is true, then the conclusion has to be true. Of course, the premise might be false in the real world, but that would not change the fact that this is a good inference.

     f. Here are a few deductions that are not permitted by the rules of inference.

          Premise:      All X’s are Y.
          Fact:         P is a Y.
          Conclusion:   Therefore, P is an X.

          Fact:         P is a Y.
          Conclusion:   Therefore, all X’s are Y.

          Fact:         P is an X.
          Conclusion:   Therefore, P is a Y.

      g. Even though these forms of argument are not well formed, they are very common in everyday life, particularly in mass media communications.
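These schemas are easy to check mechanically. As a toy sketch (my own, not part of the original outline), Python sets can model the class statements, reading ‘All X’s are Y’ as ‘the set X is a subset of the set Y’; the sample members are invented for illustration:

```python
# Read "All X's are Y" as: set X is a subset of set Y.
humans = {"Socrates", "Plato"}            # X (made-up members)
mortals = {"Socrates", "Plato", "Fido"}   # Y

# Valid schema: All X's are Y; P is an X; therefore, P is a Y.
p = "Socrates"
assert humans <= mortals and p in humans  # both premises hold,
assert p in mortals                       # so the conclusion cannot fail

# Invalid schema: All X's are Y; P is a Y; therefore, P is an X.
q = "Fido"
assert humans <= mortals and q in mortals  # both premises hold,
assert q not in humans                     # yet the conclusion is false
```

A single counterexample (Fido is a Y but not an X) is all it takes to show that the second schema is not a valid rule of inference.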

              

E. EXTERNAL CONSISTENCY

1) Up to this point, we have only talked about the internal relationships between the elements in an argument. We were assuming the truth of the premises and fact-statements.

2) But once we decide that the form of an argument is internally consistent, that it does not contradict itself or make unwarranted inferences, then we can turn our attention to the truth of the evidence used in the argument. EXTERNAL CONSISTENCY relates to what you and your audience agree is true of the real world. Unlike internal consistency, external consistency is a relative notion, since it can and does vary from audience to audience.

3) This is one of the major reasons why the selection of audience is so critical in any communication situation.

4) All arguments in everyday life are based upon a set of givens <beliefs about the world> that you and your audience tacitly agree to share. These are the shared facts, beliefs, or assumptions that are not under discussion or dispute, so they can be used in your argument to prove to your audience what you want to say. Without some set of assumed givens, communication, let alone argumentation, would not be possible.

5) An EXTERNAL INCONSISTENCY is generated when you make an assertion that contradicts something that you and your audience have already agreed to.

6) Of course, you and your audience can and may disagree without anyone being called inconsistent. In fact, disagreement underlies the whole purpose of argumentation to begin with. It is simply that once you agree to take certain facts, beliefs or assumptions as true, as givens, you can't go back on that agreement just to fit your argumentative needs. That is not disagreement. That is being inconsistent.

7) The key feature of external inconsistencies is that they have their foundation in beliefs you and your audience share about a world that is external to or outside the form of your argument. In fact, the form of your argument may be perfectly consistent from an internal point of view. In other words, you are being externally inconsistent when you contradict assumptions that you had already, explicitly or implicitly, agreed to.

8) The issue of external consistency is one of the points that distinguishes a ‘reasonable’ from an ‘expert’ interpretation or argument. Even if I do not have expert knowledge of the facts in a specific area, if I produce an interpretation that is both internally consistent and also consistent with my actual beliefs regarding the facts of the situation, then my interpretation or argument can be seen as ‘reasonable.’

  The bottom line is that you should always pay close attention to internal consistency when constructing an argument or when critiquing the arguments of others. For if an argument has internal problems, whether of consistency or definition, the argument simply will not work, regardless of what other virtues it may have.
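As a toy illustration of that first check (again my own sketch, not something from the outline above), scanning an argument for internal inconsistencies can be modeled as looking for any statement that is asserted as both true and false:

```python
def internally_consistent(claims):
    """Return False if any statement is asserted as both true and false.

    Claims are (statement, truth_value) pairs; asserting 'X IS Y' as True
    in one place and False in another is an internal contradiction.
    """
    seen = {}
    for statement, value in claims:
        if statement in seen and seen[statement] != value:
            return False  # contradiction found: the argument is dead
        seen[statement] = value
    return True

argument = [("X IS Y", True), ("Z IS W", True), ("X IS Y", False)]
print(internally_consistent(argument))  # → False
```

Note that the check says nothing about whether any claim is true of the real world; like internal consistency itself, it looks only at the logical shape of the argument.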

 Of course, in constructing this overview, I have made some sweeping generalizations and taken great liberties with some very complex issues, but as a working model, this sketch has proven to be very useful for making quick but fairly accurate analyses and evaluations ‘on the fly.’

 The important thing is not to get bogged down in terminology. I feel that most students at this level of instruction have already internalized the basic features of good thinking, so they can intuitively feel or sense when something is wrong with an argument or, as is often the case, when something important is missing. So my advice is, other things being equal, to trust your instincts.

Structured Thinking: Bringing Consistency to Problem Management


We often tout the virtues of intuitive thinking as a powerful tool for coming up with creative solutions. However, having a hammer doesn’t make every problem a nail. Oftentimes, thinking creatively to solve a complex, high-stakes technical issue can be downright damaging. Structured (or convergent) thinking can organize a flood of information to reveal suspicious gaps in data, bringing efficiency and speed to problem investigation. Using a consistent approach focused on critical data, coupled with a visual representation of this thinking, tremendously improves communication and collaboration, besides significantly advancing the search for true root cause.

Here is an example of information that was recorded in a typical, freeform field:

DPM application is not available at both DC & HQ sides, but 80% or more of the data has not updated for 0 to 4 hours at 1 or more DC’s IMOD was not informed about this issue until after 4 hours elapsed. Informed Incident Manager for Distribution.

Here is the same information using a structured thinking approach:

[Image: the same incident recorded as a structured problem specification, with a problem statement at the top and IS / IS NOT columns]

Which one is easier to read and to analyze? More importantly, which format tells you what we know AND what we need to know?

In a freeform text field we naturally tend to do one of two things: a) We input as much information as possible to fill the blank space—whether the information is relevant or not—to show all the work that was done; b) We leave it blank. This is not only a problem for proper root cause analysis (RCA), but it is a communication problem for transferring information, for trend analysis and for linking problems back to incidents and vice versa.

The format shown above describes a portion of the ITIL®-recognized Kepner-Tregoe approach we call “problem specification”. The problem statement (at the top) and eight buckets of IS/IS NOT information allow us to see, at a glance, what we know and what we don’t know. It drives critical thinking, provides structure and facilitates communication.
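To make the idea concrete, here is a minimal sketch of such a specification as structured data, populated from the freeform incident above. The dimension names (what/where/when/extent) are a common Kepner-Tregoe formulation but are my assumption here, as are the field names; the IS NOT buckets are left empty because the freeform record never captured them:

```python
# Problem specification sketch: one problem statement plus IS / IS NOT
# buckets across four dimensions (eight buckets). None = not yet known.
problem_spec = {
    "statement": "DPM application is not available at DC and HQ sites",
    "what":   {"is": "DPM application unavailable",              "is_not": None},
    "where":  {"is": "Both DC and HQ sites",                     "is_not": None},
    "when":   {"is": "Data not updated for 0 to 4 hours",        "is_not": None},
    "extent": {"is": "80% or more of the data at 1 or more DCs", "is_not": None},
}

# Empty buckets surface immediately as "what we need to know".
unknowns = [f"{dim} IS NOT" for dim in ("what", "where", "when", "extent")
            if problem_spec[dim]["is_not"] is None]
print(unknowns)  # → ['what IS NOT', 'where IS NOT', 'when IS NOT', 'extent IS NOT']
```

Unlike a freeform text field, the structure itself points investigators at the missing comparisons.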

The importance of a quality problem statement cannot be overstated (and it’s not the same as the case title). In our work with global support organizations, we have found that getting the problem statement consistently right will reduce the average time to resolve by 18% or more. A quality problem statement is pivotal to everything that follows in the RCA process.

Consider these problem statements of varying quality:

  • Server issues – This has an unclear object and potentially addresses multiple issues.
  • Database server down – This is an improvement, but we can be more specific. Which server is down? And do we already know why it is down? We want to continue to ask why until we don’t know the answer anymore.
  • Inventory Database Server is out of storage space – This statement further specifies the problem and sets us up for a focused root cause analysis. The statement is specific, involves a single object and defect, and the cause is unknown. This is where our problem analysis can begin.

A note about documentation:

Documentation is nobody’s favorite activity, but it is important. None of us like to spend a lot of time documenting, but without it, there is no reusable knowledge for our colleagues in the organization, no audit trail and no organizational learning. The important piece is to focus on the most critical information. As far as knowledge management software is concerned, the key is to model the tool after the process… and not the other way around.

A commitment to structured, clear thinking brings quality results that drive critical IT stability and produce real cost savings as we stop solving the same problems over and over and over again.




12 Examples of Consistency

Examples of Consistency in the Workplace

Consistency is a crucial factor when it comes to succeeding in the workplace. It is one of the most distinguishing factors between those who make it and those who do not. Successful people usually believe that if you do something successfully once, you can continue replicating that success with similar actions. It is the idea that to succeed, you must be consistent in your approach and actions — meaning having a plan and sticking to it, following it through, and achieving the desired results repeatedly. But what exactly is consistency, and why is it so important?

What does consistency mean?

Consistency means doing the same thing over and over again — or it means applying the same standard or rule to all. This can be a good or bad thing, depending on its context.

For example, if you are always late for work, your boss will start to expect and dread your lateness. But if you consistently do your job well, your boss will appreciate your reliability.

Another excellent example of consistency is if someone is consistently reliable, they are always dependable, and their actions, words, or behavior are always the same.

If something is consistent in terms of being equal, all aspects are the same size, quantity, or degree.

Lastly, if someone is consistent in sticking to a decision or plan, they can follow through with what they have said they will do and do not change their mind often.

In general, consistency in the workplace is seen as a positive attribute because it shows that you are reliable. And those are qualities that people respect and admire.

Why is consistency important?

Consistency is important because it makes people believe you will always behave in a certain way. It builds trust between people and enables them to predict how you will behave in the future.

People appreciate consistency because it makes life easier — they know what to expect from you and can plan accordingly.

Overall, consistency creates structure and order in the workplace, which reduces confusion and provides clarity on expectations.

When there are consistent standards across the workplace, it boosts overall performance — leading to long-term success for everyone involved.

If everyone knows their roles and responsibilities, they can work together more efficiently towards a common goal. But how do you exhibit consistency in the workplace? Here are 12 examples of consistency:

1. Coming into work every day on time

An excellent example of consistency is someone who comes to work every day on time and is ready to start.

The person sends a message to their boss and colleagues that they are dependable and can be counted on to get the job done.

It also shows that they take their work seriously and are eager to put in the effort required to be successful.

Coming in late or unprepared sends the message that you do not take your job seriously or are not committed to your responsibilities. It can also disrupt workflow and set a negative example for others.

2. Being proactive and showing initiative

Nothing frustrates a boss like an employee who waits to be told what to do. Not only does this make the boss’s job harder, but it also suggests that the employee is not taking ownership of their work.

That signals a lack of interest or enthusiasm. Worse yet, it can come across as unprofessional and lazy.

On the other hand, being proactive and showing initiative every day in the workplace will make your boss happy. This is because it shows that you are willing to take ownership of your work and are proactive and independent.

And it also makes your boss’s job easier since they do not have to give directions or oversee everything you constantly do.


3. Following work procedures and rules

Sticking to your work procedures and rules can be difficult when they seem rigid and restrictive. But remember that following them consistently enables you to do your job well and makes it possible for others to do their jobs too.

Of course, there are times when it is necessary to be flexible, but those instances should be the exception, not the rule.

And if you need to deviate from the procedures, you must communicate with your manager and team, so they know what is happening. That way, everyone remains on the same page, and no one gets confused or caught off guard.

Doing things the right way is essential to maintaining order in the workplace. When people do not follow procedures and rules, it creates chaos and can lead to disastrous consequences.

By being consistent in your daily behavior, you help create an orderly and productive work environment.

4. Treating everyone with respect

Imagine if someone looked down on you simply because they were in a higher position than you. It would feel pretty crappy.

Unfortunately, some people believe that it is right to treat those who are below them in rank with disrespect. But here is the thing — treating others with respect is the right thing to do, regardless of their position within the company.

It does not matter if you are the boss, team leader , or employee — all people deserve the same respect. Even if you are having a bad day, maintain a civil attitude toward those around you.

Nothing ruins work morale more than someone who is consistently negative and disrespectful. Instead of letting anger or stress get the best of you, remind yourself that everyone deserves to be treated with basic human decency.

5. Sticking to your deadlines

It can be tempting to push back a deadline when it seems difficult to meet, but to be someone reliable and consistent, it is best to stick to your deadlines.

Pushing back a deadline makes you look unreliable or incapable of meeting commitments , which can breed mistrust among your colleagues. Worse yet, if this becomes a regular habit, it can ruin your reputation and career.

Instead of pushing back deadlines, try to find ways to make them more manageable. Ask for help from others on your team, brainstorm solutions, and set realistic milestones.

6. Not taking shortcuts at the expense of quality

It may be tempting to cut corners or take the easy way out, but this is often a recipe for failure. So, always do things the right way the first time around. This may seem like common sense, but it is something that many people forget.

When you try to take shortcuts or cut corners, you can save time in the short term, but you may end up having to go back and fix things later, which will take more time overall — the results are usually not as good as if you had done it the right way in the first place.

And if you are cutting corners on quality or accuracy, you will only end up hurting yourself in the long run.

So, do not be tempted to take shortcuts — it is always best to do things right from start to finish.


7. Following through on commitments

Being someone who follows through on commitments–whether big or small–is a critical component of professional and personal success.

When you commit yourself to do something, people are counting on you to deliver — and by doing so, you prove that you can be trusted. This builds credibility and creates a foundation of trust that can be relied upon in the future.

Furthermore, following through on commitments instills a sense of discipline in your life. It shows that you will do the job, even if it requires extra effort. This is an important characteristic to have, especially when things get tough.

8. Putting in the same effort each day

One good example of consistency at work is an employee who puts in the same effort each day, regardless of how they feel. This is important because it sets a precedent for others on the team and shows that you are reliable and can be counted on.

While it is normal to have moments where you do not feel motivated or drained, you must push through those feelings and continue to put in the same effort.

That will help maintain your productivity and consistency over time, which can be beneficial at work and in other areas.

9. Acting with integrity in all situations

Another excellent example of consistency is to act with integrity in all situations. This means being honest and truthful, even when no one is looking.

It is doing the right thing, even when it is challenging. It is making choices that reflect your values and beliefs.

One of the best ways to act with integrity is to develop a robust personal code of ethics–a set of standards that you live by every day.

When you know what you stand for, making decisions that align with your beliefs is easier. And if you ever face a difficult choice, asking yourself, “what would my code of ethics say?” can help you make the right decision.

People who act with integrity in one situation will usually do so in all cases because their behavior is based on their values and not on what might be easiest or most advantageous.

10. Cultivating a good attitude

Another example of consistency at work is when you cultivate a good attitude. When you come to work with a positive attitude , you are more likely to be productive and have a good day.

People will try to avoid you if you go to work in a bad mood. No one wants to work with someone who is negative and constantly complains. On the other hand, if you maintain a good attitude all the time, others will be drawn to you.

A positive attitude is contagious and can positively impact the entire workplace. Other people will be more cooperative and helpful, making everyone’s job easier. They will appreciate your positive outlook and be more likely to want to work with you.

So, never change your attitude or demeanor, even if something goes wrong or someone is difficult.

Keep a positive outlook and maintain a good attitude during difficult times or when things do not go your way.


11. Behaving professionally in all situations

Gossiping or complaining behaviors can reflect poorly on you and make you seem less competent or qualified for the job.

To uphold high standards of professionalism, you must live up to that reputation by always conducting yourself respectfully and fairly.

If you remain professional, people will take you more seriously. They will rely on you to act with integrity and respect, even in challenging situations.

People will trust you to represent them or their organization in a positive light, no matter what the situation may be.

Finally, behaving professionally demonstrates that you respect yourself and others. It shows that you are committed to doing the right thing, even when it is not easy or inconvenient.

Overall, it is essential to remember that how we behave affects not only our personal lives but also our professional lives.

To be successful and respected in the workplace, we must act professionally at all times — that is what consistency means.

12. Doing your job to the best of your abilities

Set and maintain the standards for your job — if you always do your best, it establishes a pattern for others to follow.

In addition, doing your job to the best of your abilities shows that you take pride in your work and are committed to providing quality work . This can earn you respect from your colleagues and manager.

Also, being consistent at work can help you develop new skills. When you constantly challenge yourself to do your best, you learn new ways to complete tasks and solve problems.

It can make you a more valuable employee and increase your chances of being rewarded with a pay rise or promotion.

And doing your job consistently well can make you feel good about yourself. It provides a sense of accomplishment and makes you feel like you are contributing to something larger than yourself.

Conclusion:

Exhibiting consistency in the workplace is essential for success — not only for yourself but also for your colleagues and team leaders.

It builds trust between peers, improves performance within teams and departments, and produces measurable results that keep everyone focused on reaching organizational goals.

Whether you are a manager or a team member, striving for consistency at work can lead to more incredible professional and personal achievements.


The effects of using collaborative digital storytelling on academic achievement and skill development in biology education

  • Open access
  • Published: 17 April 2024


  • Sinan Bilici, ORCID: orcid.org/0000-0002-0610-2126
  • Rabia Meryem Yilmaz, ORCID: orcid.org/0000-0002-0453-1357

The purpose of the study is to investigate the effect of the use of digital storytelling on academic achievement, critical thinking dispositions, co-regulation, and narrative skills of 10th grade students. To this end, the study was conducted using a semi-experimental design with a convenience sample. The participants consisted of 64 students (33 in experimental and 31 in control group) who were studying in a high school. After the groups were trained, a two-week pilot study was conducted by forming collaborative groups among the students. This was followed by eight weeks of main implementation, during which students presented their projects to the class every two weeks. Following the digital story presentations in the experimental group, feedback was provided by the course instructor and peers. In addition, rubric scores were generated by the researchers for each digital story. Academic achievement test, critical thinking disposition scale, co-regulatory skills scale, and digital story evaluation rubric were used as data collection tools at the end of the process. Independent samples t-test, repeated ANOVA, and regression analysis were performed on the collected data. According to the results, digital story activities had moderate positive effects on students’ academic achievement and critical thinking, and high positive effects on co-regulation. In addition, the narrative skills of the students in the experimental group increased significantly over the weeks with a difference of 27.44 points. There was also evidence that storytelling ability was a significant predictor of academic achievement and that this ability increased significantly over the weeks. The results showed that the collaborative creation of a digital story by the students had a positive effect on their academic achievement and the development of their skills.


1 Introduction

With the technological explosion of communication and globalization, there is a shift from traditional understandings of literacy to exploring different forms of meaning-making. In this direction, it has been noted that today’s students, referred to as Generation Z, use information technologies to create information, transform data into information and share it, and also learn in different ways compared to previous generations (Malita & Martin, 2010; Toki & Pange, 2014). These students, who have grown up with digital technologies, prefer multimedia content that is rich in visual and auditory terms to content that is mainly textual (France & Wakefield, 2011). Therefore, it is becoming increasingly important to use contemporary learning methods to attract students’ attention (Ohler, 2006; Smeda et al., 2014). Appropriate teaching approaches supported by contemporary technologies and original teaching methods that create the desired skills for students come to the fore as a need in this sense (Seferoğlu, 2015). Digital storytelling, which is considered as one of these teaching methods, has emerged as a result of the combination of today’s transformative technologies and traditional stories (Sadik, 2008; Yang & Wu, 2012).

In this study, the digital storytelling method was used, which is an innovative pedagogical method that attracts the attention of today’s youth who tend to use technology (Smeda et al., 2014). Potential positive aspects such as digital storytelling providing a student-centered, fun and interactive collaborative environment (Chan et al., 2017; Çetin, 2021; Lantz et al., 2020), encouraging critical thinking in the product design process (Hung et al., 2012; Malita & Martin, 2010), improving narrative skills through efforts to create an original scenario (Dogan & Robin, 2008; Foley, 2013), and improving learning performance as a result of active interaction (Figg & McCartney, 2010) guided this study. Digital storytelling makes this study important because it is a student-centered innovation that combines the power of both traditional storytelling and technology, and its use in education has grown in recent years. The integration of the digital storytelling method, which is economical and easy to implement, into learning environments, especially with the help of existing technologies in the field of education, will have an important place in students’ acquisition of many 21st century skills (Yuksel-Arslan et al., 2016). In this regard, it is believed that examining academic success variables along with 21st century skills such as critical thinking dispositions, co-regulation, and narrative skills that are expected of today’s students provides a holistic and broad perspective to the study.

2 Theoretical framework

2.1 Digital storytelling

Digital stories are powerful learning and teaching tools that combine traditional storytelling skills with digital components such as text, images, sound recordings, music, and video (Robin, 2016). Digital stories revolve around a chosen topic and often have a specific point of view, similar to traditional storytelling. Digital stories consist of personal perspective, interesting question, emotional content, sound effects, musical power, economy, and pacing (Bull & Kajder, 2004; Robin, 2006). Although there are different types in the literature covering many disciplines at different educational levels, it is possible to divide the most common types into three categories in terms of content: personal, historical, and didactic stories (Robin, 2008).

Digital storytelling, which is a student-centered and constructivist approach, is seen as an educational technology and literacy learning tool that uses almost all the skills expected of 21st-century students (Dogan & Robin, 2009; Lantz et al., 2020; Yuksel-Arslan et al., 2016). It is often mentioned in the literature that it provides a strong foundation for 21st-century literacy, such as digital, global, technological, media, visual, and information literacy (Chan et al., 2017; Çetin, 2021; Di-Blass et al., 2009; Robin, 2008; Xu et al., 2011; Yang & Wu, 2012). Due to the potential impact of digital storytelling on skill development, the current study focuses on critical thinking dispositions, narrative skills, and co-regulatory skills in addition to academic achievement.

It is stated that when students collaborate, the learning process can become more interesting and enjoyable despite the repetitive nature of the learning process (Laal & Ghodsi, 2012). At this point, it is seen that digital storytelling comes to the forefront as an effective collaborative tool in learning environments. Students who participate in digital storytelling activities perform a dual function of learning and having fun together (Toki & Pange, 2014). It is argued that in almost all processes of digital storytelling, from the idea stage to the sharing of products, it often creates an environment for collaboration, communication, and interaction among students (Nam, 2017; Ming et al., 2014). Technology recedes into the background as students work together to develop their projects. Thus, the process also provides an opportunity to interact with the content and each other while creating digital stories (Lantz et al., 2020). When digital stories are created in a collaborative environment, students can take on different roles such as designers, listeners, commentators, readers, writers, communicators, artists, and thinkers (Bull & Kajder, 2004). Within the group, students can actively exchange ideas and give and receive feedback. Sharing and evaluating digital stories among peers also allows students to express themselves, talk critically with each other, develop tolerance, and take responsibility (Hung et al., 2012; Malita & Martin, 2010). Their efforts to synthesize the information they have gathered about the topic into an original scenario also contribute to the development of narrative skills. On the other hand, it is argued that the process of cooperation and communication within the group is effective for students to build the content together, provides more meaningful learning and supports their academic success (Figg & McCartney, 2010; Jenkins & Lonsdale, 2007).

2.2 Critical thinking and digital storytelling

Critical thinking, defined as a judgment process that guides problem solving and decision making, has two dimensions: ability and disposition. While critical thinking skill is the ability to think critically with ease and proficiency, critical thinking disposition is the desire, sense of responsibility, and attitude necessary for a person to think critically (Facione, 1990). Because it affects performance in all areas of social life, the development and promotion of students’ critical thinking is considered one of the main goals of today’s educational process (Facione, 2011; Giancarlo & Facione, 2001).

The literature emphasizes that digital storytelling has an important place in promoting critical thinking (Lampert, 2007) and students’ critical reflection on what they have learned (Robin, 2016). The digital storytelling process offers students opportunities to think critically at every stage, from identifying a topic to sharing the product: it inspires, encourages thinking and creativity, fosters interaction, and prompts students to reflect on their knowledge and solve problems (Jenkins & Lonsdale, 2007; Ohler, 2006; Robin, 2008; Xu et al., 2011). In creating a digital story, students have the freedom to be critical in selecting content that supports their story in a meaningful way (Chan et al., 2017; Czarnecki, 2009). This study suggests that digital storytelling, as a contemporary student-centered pedagogy, can be effectively integrated into the learning environment to enhance students’ critical thinking dispositions.

2.3 Collaborative learning and digital storytelling

Collaborative learning is defined as an interactive process in which authority and responsibility are shared among group members and all members are united around a common goal (Laal & Ghodsi, 2012; Tezci & Perkmen, 2016). During collaborative learning, activities are organized at different levels of social interaction: individual, pair, and group (Hadwin & Oshige, 2011). Co-regulation at the group level is a dynamic regulatory process and interaction that coordinates the self-regulation processes of two or more peers during learning (DiDonato, 2013).

Digital storytelling is known to be a powerful method and collaborative tool that promotes classroom collaboration and student knowledge construction (Boase, 2008; Hung et al., 2012; Yuksel et al., 2011). When students are asked to create their own digital stories, either individually or as members of a small group, the greatest benefits of digital storytelling can be realized, and team building, cooperation, and other interpersonal skills can be improved (Reinders, 2011; Sadik, 2008). It is argued that students who create digital stories in a collaborative learning environment improve their communication skills, learn to ask questions, and express their ideas more easily (Hafner & Miller, 2011; Malita & Martin, 2010; Wang & Zhan, 2010). Accordingly, the digital story activities implemented in the current study are expected to help students acquire the collaborative skills required today.

2.4 Narrative skill and digital storytelling

When individuals construct stories, many cognitive and linguistic skills play a role in the writing process (Bumgarner, 2012; Ohler, 2013). Narrative skill is therefore seen as a complex product-creation process that requires a high level of thinking and interaction in the human mind (Karadağ & Maden, 2013; Özbay & Barutçu, 2013). Among the many highlighted benefits of storytelling is that it allows listeners to effortlessly assimilate information and incorporate it into their existing schemas (Csikar & Stefaniak, 2018). Although narrative skill plays an important role in transferring information and transforming it into learning gains, this skill does not develop spontaneously (Temizkan, 2011).

It is argued that the use of various tools and techniques offered by modern technologies, such as digital storytelling, provides important opportunities to improve narrative skills in this sense (Bumgarner, 2012 ; Campbell, 2012 ; Dogan & Robin, 2008 ; Foley, 2013 ; Ohler, 2013 ; Oskoz & Elola, 2014 ). Digital storytelling helps students to manage and understand their story writing processes (Yamaç & Ulusoy, 2016 ) and positively affects their narrative skills, ideas, organization, and sentence fluency (Ohler, 2006 ; Sylvester & Greenidge, 2009 ). Particular emphasis has been placed on the impact of scriptwriting, which is considered the first and most important step in the digital storytelling process, on narrative skills (Ohler, 2006 ; Robin, 2008 ; Xu et al., 2011 ). In this research, it is believed that with the effective integration of digital storytelling into the learning environment, students will increase their academic achievement and improve their narrative skills.

2.5 Significance of the study

Its potential to mobilize and develop 21st-century skills (Smeda et al., 2014; Wang & Zhan, 2010) has made digital storytelling the focus of the current study. The literature has examined the pedagogical effects of digital storytelling on students’ academic achievement (Figg & McCartney, 2010; Yang & Wu, 2012), collaboration (Hung et al., 2012; Yuksel et al., 2011), attitudes (Sadik, 2008; Smeda et al., 2014; Yang & Wu, 2012), motivation (Chan et al., 2017; Di-Blas et al., 2009), critical thinking (Czarnecki, 2009; Yang & Wu, 2012), active learning (Boase, 2008; Ohler, 2006; Xu et al., 2011), writing skills (Oskoz & Elola, 2014; Tanrıkulu, 2021), communication (Malita & Martin, 2010; Sarıca & Usluel, 2016a), problem solving (Abdel-Hack & Helwa, 2014; Yang & Wu, 2012), creativity (Bedir-Erişti, 2016; Nordmark & Milrad, 2012), reflection (Kim & Li, 2021), interest (Ivala et al., 2013), social learning (Ming et al., 2014; Robin, 2006), and deep learning (Barber, 2016). Foreign language teaching stands out as the most frequently studied discipline (Fu et al., 2021; Hafner & Miller, 2011; Ming et al., 2014).

The current study was conducted in a high school biology course. It is known that, due to high cognitive load and the abundance of scientific concepts and principles, students encounter difficulties in science-based lessons and struggle to understand and remember the concepts taught (Condy et al., 2012; Csikar & Stefaniak, 2018). From this perspective, the process of creating a digital story has the potential to improve learning through students’ active interaction with the content, their groupmates, and the teacher, and is well suited for group work (Figg & McCartney, 2010; Jenkins & Lonsdale, 2007). For this reason, examining the variables of co-regulatory skills and academic achievement was considered important. In addition, although the literature contains digital story-oriented studies in secondary school science education (Çiçek, 2018; Dewi et al., 2018; Hung et al., 2012) and university biology education (Frisch & Saunders, 2008; Karakoyun & Yapıcı, 2016), no study was found that investigates the effect of digital storytelling in high school biology education specifically. Moreover, although critical thinking is considered an important educational goal (Facione, 2011), there are few studies on the effect of digital storytelling on critical thinking, and existing studies have focused on the skill dimension of critical thinking (Csikar & Stefaniak, 2018; Yang & Wu, 2012). The fact that the high school period coincides with the 12–18 age range, when thinking skills mature, makes critical thinking education important during this period (Erdem & Genç, 2015). In this sense, investigating the effect of digital storytelling on the critical thinking dispositions of high school students is considered valuable.
On the other hand, the prominence of digital storytelling as a powerful approach that can develop narrative skills by initiating a high level of interaction and thought in the minds of individuals (Abdel-Hack & Helwa, 2014; Ohler, 2006; Sylvester & Greenidge, 2009) guided the choice of variables in this study. The story scenarios that high school students construct during digital story activities are believed to activate many cognitive skills and enhance their academic performance. Accordingly, the current study aimed to examine the effect of digital storytelling on 10th grade students’ academic achievement, critical thinking dispositions, co-regulation, and narrative skills. To this end, answers to the following sub-problems were sought:

For the 10th grade students in the experimental and control groups:

Is there a significant difference in levels of academic achievement?

Is there a significant difference in levels of critical thinking disposition?

Is there a significant difference in levels of co-regulation skills?

Is there a significant difference in the narrative skills of the experimental group students across the weeks?

Is there a correlation between academic achievement, critical thinking disposition, co-regulation, and narrative skill levels of students in the experimental group?

3.1 Research design

The study used a quasi-experimental design with a pretest-posttest matched control group. The quasi-experimental design is often used in educational and psychological studies because unbiased samples are difficult to obtain (Büyüköztürk et al., 2013; McMillan & Schumacher, 2010). In the quasi-experimental design with matched pretest-posttest control groups, two pre-existing groups are matched on certain variables. The matched groups are then randomly assigned to the treatment conditions, determining the experimental and control groups. Equivalence is tested by administering pre-tests to the study groups, after which the implementation process begins; at the end of the process, post-tests are administered and the results are compared (Creswell, 2012; McMillan & Schumacher, 2010). Although the inability to randomly assign participants seems to be the main weakness of this design, the use of pre-tests on the characteristics to be examined makes the design useful and appropriate. In the current study, since there is no specific placement system among the classes in the selected school, the academic achievement pre-test was administered to all 10th grade classes before the implementation process to determine whether the students were equivalent in terms of academic achievement. The two classes with the closest pre-test mean scores were assigned as the experimental and control groups (class 10/A as the control group, class 10/D as the experimental group). At the end of the experimental process, post-tests were administered to the groups. Figure 1 illustrates the matched quasi-experimental design used in the study. The fact that the sample group was in the same school as the researcher facilitated communication and coordination with the students and the biology teacher. In addition, the researcher was able to intervene quickly in technical problems that arose in the computer classroom.

figure 1

Quasi-experimental design in this study

3.2 Sample group

The study group was selected from the 10th grade students of a high school using convenience sampling, a non-random sampling method. To determine the equivalence of the groups, the academic achievement pre-test was administered to all 10th grade classes, and the two classes with the closest pre-test mean scores were assigned as the experimental and control groups. A one-factor ANOVA on the academic achievement pre-tests indicated that the groups were equivalent (p > .05). The class size of the experimental group was 33 and that of the control group 31, for a total of 64 students. Demographic information about the sample group is presented in Table 1.

The demographic characteristics of the sample group were collected using an information form prepared by the researchers. Most of the participants own mobile devices such as tablets and smartphones, while a substantial number do not have their own computer. Only 5 of the students had created a digital story before this study. Thirty-four students indicated that they could use computers at an intermediate level and 16 at a good level.

3.3 Implementation process

The implementation process of the study, including the administration of pre-tests, training of groups, pilot implementation, main implementation, and post-tests, took a total of 13 weeks. Figure  2 summarizes the stages of the experimental implementation process by week.

figure 2

Experimental implementation process by week

Lectures were given by the same teacher in both the experimental and control groups. In the first week, before the implementation, the groups were informed and the training plans were made. Four-hour (2 + 2) training sessions were given to the experimental group on creating a digital story and to the control group on creating a PowerPoint presentation. After the training, the students in both the experimental and control groups were divided into 7 groups. Group composition was left to the students, who formed groups of 4–5 with friends they believed they could work with harmoniously. After the preparation and planning process, a two-week pilot implementation was conducted to test the system and identify problems before the main implementation. In the pilot, the small groups in the experimental group were assigned topics and asked to create digital stories while developing solutions to the problems they encountered. After the pilot, the main 8-week implementation was carried out. The topics addressed in the experimental and control groups during the implementations are shown in Fig. 3. The Cell Division unit, in which digital stories were created during the implementation process, includes the subtopics Mitosis and Asexual Reproduction and Meiosis and Sexual Reproduction. These topics cover a period of 10 weeks in the curriculum (Ministry of Education, 2018). Because of the length of time this unit requires, the study focused on this unit only.

figure 3

The implementation process flow in the experimental and control groups

In this process, which was carried out in collaboration with the students, the researchers mostly followed the story development processes, provided guidance where needed, and ensured data collection. The researchers were actively involved in all processes of group determination, pre-testing, training of experimental and control groups, implementation, and post-testing.

3.4 Data collection tools

The Critical Thinking Disposition Scale, the Co-regulation Skills Scale, and the Digital Story Evaluation Rubric were taken from sources in the literature, and the Academic Achievement Test was developed by the researchers. The relationship between the data collection instruments and the research questions is shown in Table  2 .

3.4.1 Academic achievement test

First, the objectives and outcomes related to the “cell division” unit in the current 10th grade biology curriculum of the Ministry of National Education were identified to determine the behaviors to be measured by the academic achievement test. A pool of 36 multiple-choice questions was then created at different levels of Bloom’s cognitive taxonomy to cover the learning outcomes. Based on expert opinion, 18 additional questions were added to the question pool for the test form, and the levels of some questions were revised. To ensure the content validity of the test, tables of specifications were created before and after the item analysis. For the construct and content validity of the prepared achievement test, opinions were obtained from an assessment and evaluation specialist, a biology faculty member, a Computer Education and Instructional Technology (CEIT) faculty member, and two biology teachers. The trial form of the test was administered to a total of 121 students in the 11th grade who had covered the same unit the previous year.

In evaluating the item difficulty and item discrimination indices, the values specified by Turgut and Baykul (2015) were taken as a reference. As a result of the item analysis, 25 items with a discrimination index (r) below 0.30 were excluded from the test. The remaining 29 items, with discrimination indices of 0.30 and above and difficulty indices between 0.27 and 0.73, were included in the final test. The mean difficulty index of the final test was 0.50 and the mean discrimination index 0.38, indicating that the final test is of average difficulty and discriminates well between high and low achievers. The KR-20 coefficient, which indicates the internal consistency of the test, was calculated to be 0.82.
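The item statistics above follow standard classical test theory formulas. As a minimal sketch (the 0/1 score matrix below is hypothetical, not the study’s data, and the upper-lower split uses halves rather than the 27% groups common in practice), the indices can be computed as:

```python
# Classical test theory item analysis on a hypothetical 0/1 score matrix
# (rows = students, columns = items). Illustrative only; not the study's data.
scores = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
n_students = len(scores)
n_items = len(scores[0])
totals = [sum(row) for row in scores]

# Item difficulty p: proportion of students answering the item correctly.
difficulty = [sum(row[j] for row in scores) / n_students for j in range(n_items)]

# Upper-lower discrimination D: difference in difficulty between the
# top and bottom scoring groups (halves here, as a simplification).
ranked = sorted(scores, key=sum, reverse=True)
half = n_students // 2
upper, lower = ranked[:half], ranked[half:]
discrimination = [
    sum(r[j] for r in upper) / half - sum(r[j] for r in lower) / half
    for j in range(n_items)
]

# KR-20 internal consistency: (k/(k-1)) * (1 - sum(p*q) / total-score variance).
mean_total = sum(totals) / n_students
var_total = sum((t - mean_total) ** 2 for t in totals) / n_students
sum_pq = sum(p * (1 - p) for p in difficulty)
kr20 = (n_items / (n_items - 1)) * (1 - sum_pq / var_total)

print(difficulty)  # item difficulties between 0 and 1
print(kr20)
```

On real data, items would be retained or dropped by comparing `difficulty` and `discrimination` against the reference cut-offs (here, r ≥ 0.30 and 0.27 ≤ p ≤ 0.73).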

3.4.2 Critical thinking disposition scale

The UF/EMI (University of Florida Engagement, Maturity, and Innovativeness) critical thinking disposition scale used in this study was developed by researchers at the University of Florida and adapted into Turkish by Kılıç and Şen (2014). It is a five-point Likert-type scale consisting of three sub-dimensions: 11 items in the engagement sub-dimension, 7 in the cognitive maturity sub-dimension, and 7 in the innovativeness sub-dimension. For the validity and reliability study, the scale was reported to have been administered to 342 students in the 9th and 10th grades of secondary education. Confirmatory factor analysis for construct validity yielded a χ²/df ratio of 2.99 (813.66/272) and RMSEA = 0.08. The Cronbach’s alpha internal consistency coefficient was 0.91 for the total scale, 0.88 for the engagement sub-dimension, 0.70 for the cognitive maturity sub-dimension, and 0.73 for the innovativeness sub-dimension. In the current study, the Cronbach’s alpha coefficients were 0.89 for the total scale, 0.87 for engagement, 0.68 for cognitive maturity, and 0.65 for innovativeness.
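Cronbach’s alpha, reported here and for the other scales, is computed from item variances and total-score variance. A minimal sketch on hypothetical Likert responses (not the study’s data; population variance is used throughout for simplicity):

```python
# Cronbach's alpha for a hypothetical 5-respondent x 3-item Likert matrix.
# Illustrative only; the study's coefficients come from its own data.
responses = [
    [5, 4, 5],
    [4, 4, 4],
    [3, 3, 4],
    [2, 3, 2],
    [1, 2, 1],
]
n = len(responses)
k = len(responses[0])

def pvar(values):
    """Population variance (divide by n)."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

item_vars = [pvar([row[j] for row in responses]) for j in range(k)]
total_var = pvar([sum(row) for row in responses])

# alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 3))
```

For dichotomous 0/1 items, the same formula reduces to the KR-20 coefficient used for the achievement test.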

3.4.3 Co-regulation skill scale

The co-regulation skill scale used in the study was developed by DiDonato (2013) and adapted into Turkish by Pan and Tanrıseven (2016). Before it was used in this study, the scale was administered to three 10th grade students to determine whether it was appropriate for high school students. The students indicated that the scale was clear and easy to understand. After receiving a detailed evaluation from the students and consulting two field experts, the scale was applied. The scale consists of 19 items and measures students’ behaviors related to the cooperative organization of the learning process. The 4-point Likert-type scale was reported to have been administered to 100 pre-service teachers for the validity and reliability study. The researchers conducted confirmatory factor analysis to test the validity of the scale, which has a single-factor structure; the fit indices of the model were reported as RMSEA = 0.074, NFI = 0.94, CFI = 0.95, AGFI = 0.87, and GFI = 0.91. The factor loadings of the scale items ranged from 0.26 to 0.70, and the root mean square error of approximation (RMSEA) was calculated to be 0.068. The Cronbach’s alpha internal consistency coefficient calculated to determine the reliability of the scale was 0.89; in the current study, it was calculated to be 0.83.

3.4.4 Digital storytelling evaluation rubric

This measurement tool, developed by Sarıca and Usluel (2016b), consists of 30 criteria in total: 8 for the story section, 4 for the storyboard section, and 18 for the digital story section. The rubric was reported to have been submitted for review to five experts working on digital stories and two measurement and evaluation experts, and weighted kappa coefficients were calculated from two independent raters for reliability. According to the reported results, all criteria of the story, storyboard, and digital story sections showed significant and good levels of agreement.

3.5 Data analysis

The analysis types that meet each research question are given in Table  3 .

The analysis of the variables of academic achievement, critical thinking disposition, co-regulation, and narrative skills revealed no missing data or extreme values in the dataset. The normality analyses indicated that the data were normally distributed: the kurtosis and skewness values of all variables fell within the ±1.50 range recommended by Tabachnick et al. (2007). The assumptions of analysis of variance were tested, and the variances were found to be homogeneously distributed (p > .05).
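The ±1.50 normality screen can be reproduced with simple moment formulas. A sketch on hypothetical data (moment-based skewness and excess kurtosis; statistical packages may apply slightly different small-sample corrections):

```python
# Moment-based skewness and excess kurtosis, used to screen for
# approximate normality against a +/-1.50 criterion. Hypothetical data.
def moments_check(data, limit=1.5):
    n = len(data)
    mean = sum(data) / n
    m2 = sum((x - mean) ** 2 for x in data) / n  # variance
    m3 = sum((x - mean) ** 3 for x in data) / n
    m4 = sum((x - mean) ** 4 for x in data) / n
    skewness = m3 / m2 ** 1.5
    excess_kurtosis = m4 / m2 ** 2 - 3
    ok = abs(skewness) <= limit and abs(excess_kurtosis) <= limit
    return skewness, excess_kurtosis, ok

skew, kurt, within_limits = moments_check([1, 2, 3, 4, 5])
print(skew, kurt, within_limits)  # symmetric data: skewness is 0
```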

4.1 Differences between academic achievement levels

The results of the independent group t-test analysis, which was used to determine whether there was a significant difference between the academic achievement post-test scores of the groups, are presented in Table  4 .

According to Table 4, the post-test academic achievement mean of the experimental group students (M = 76.82, SD = 13.72) was significantly higher than that of the control group students (M = 68.35, SD = 16.68) (t(62) = -2.224, p < .05).
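The reported t value can be reproduced from the summary statistics alone with the pooled-variance formula for an independent-samples t-test:

```python
# Pooled-variance independent-samples t-test from summary statistics,
# using the means, SDs, and group sizes reported in Table 4.
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    # Pooled variance: weighted average of the two group variances.
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, n1 + n2 - 2  # t statistic and df

# Control (n = 31) minus experimental (n = 33) achievement post-test.
t, df = pooled_t(68.35, 16.68, 31, 76.82, 13.72, 33)
print(round(t, 3), df)  # approximately -2.224 with df = 62
```

The same formula, with the Table 5 and Table 6 summary statistics, reproduces the other reported t values.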

4.2 Differences between critical thinking disposition levels

The results of the independent group t-test analysis, which was used to determine whether there was a significant difference between the groups’ post-test scores on critical thinking dispositions, are presented in Table  5 .

According to Table 5, a significant difference was found between the experimental and control groups in the engagement factor (t(62) = -2.190, p < .05), the cognitive maturity factor (t(62) = -3.736, p < .001), and the total scale scores (t(62) = -2.830, p < .05) in favor of the experimental group. In the innovativeness factor, although there was no statistically significant difference between the groups (t(62) = -1.631, p > .05), the mean of the experimental group (M = 29.18, SD = 3.66) was higher than that of the control group (M = 27.74, SD = 3.37).

4.3 Differences between co-regulation skill levels

The results of the independent group t-test analysis, which was used to determine whether there was a significant difference between the groups’ post-test scores on co-regulation skills, are presented in Table 6.

According to Table 6, the post-test co-regulation skills mean of the experimental group students (M = 66.15, SD = 4.94) was significantly higher than that of the control group students (M = 59.00, SD = 6.73) (t(62) = -4.862, p < .001).

4.4 Change of narrative skill levels by week

The results of the one-factor repeated ANOVA test showing the change in narrative skill scores according to the weeks in the experimental group are presented in Table  7 .

According to the findings in Table 7, there was a statistically significant difference between the students’ digital story scores across the weeks [F(2.142, 68.551) = 847.214, p < .001]. The change in the mean narrative skill score over time is shown in Fig. 4.
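The weekly comparison uses a one-factor repeated-measures ANOVA, which partitions the total variance into time, subject, and error components. A minimal sketch on a tiny hypothetical score table (the Greenhouse-Geisser sphericity correction applied in the study, which produces the fractional degrees of freedom above, is omitted here):

```python
# One-factor repeated-measures ANOVA on hypothetical weekly scores
# (rows = students, columns = weeks). Sphericity correction omitted.
scores = [
    [2, 4, 6],
    [3, 5, 7],
    [2, 5, 8],
    [3, 6, 9],
]
n = len(scores)          # subjects
k = len(scores[0])       # repeated measurements (weeks)
grand = sum(sum(row) for row in scores) / (n * k)

week_means = [sum(row[j] for row in scores) / n for j in range(k)]
subj_means = [sum(row) / k for row in scores]

# Partition the total sum of squares into time, subject, and error parts.
ss_time = n * sum((m - grand) ** 2 for m in week_means)
ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
ss_total = sum((x - grand) ** 2 for row in scores for x in row)
ss_error = ss_total - ss_time - ss_subj

df_time, df_error = k - 1, (k - 1) * (n - 1)
F = (ss_time / df_time) / (ss_error / df_error)
print(F, df_time, df_error)  # F(2, 6) = 75.0 for this toy table
```

Removing between-subject variance from the error term is what makes the repeated-measures F so much larger than a between-groups F would be on the same scores.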

figure 4

Increase in narrative skill scores by week

4.5 The relationship between academic achievement and other variables

The results of the multiple regression analysis performed to determine the power of the variables in predicting the academic achievement of the students in the experimental group are presented in Table  8 .

For the multiple regression analysis, Mahalanobis distance values in the dataset were examined, and all values were found to be less than the critical χ² table value for three independent variables (D² < 18.47, p > .001); the data were normally distributed and contained no extreme values. According to Seçer (2015), there should be at least 15 participants for each predictor variable; in this study, the number of participants is 64. The Durbin-Watson (1.783), tolerance (0.773), and VIF (1.293) values show that there is no multicollinearity problem in the analysis. As Table 8 shows, the regression model established was statistically significant (F = 6.185; p < .05). The three variables together explain 39% of the total variance in the academic achievement post-test. When the t-test results for the significance of the standardized regression coefficients (β) were examined, only narrative skill was a significant predictor of academic achievement post-test scores (β = 0.353; t = 2.140; p < .05). The relative order of importance of the predictor variables for the academic achievement post-test was narrative skill, critical thinking disposition, and co-regulatory skill. The regression model that emerged under the conditions of this study for predicting the academic achievement post-test is “Academic achievement = -73.889 + 0.296 × narrative skill”.
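Since only narrative skill remained a significant predictor, the final model reduces to a simple least-squares line. As a sketch of the mechanics (the narrative-skill values below are hypothetical; the coefficients are the ones reported above), ordinary least squares recovers the reported equation exactly when the points lie on it:

```python
# Simple least-squares regression y = a + b*x. The x values are
# hypothetical narrative-skill scores; y is generated from the reported
# model "achievement = -73.889 + 0.296 * narrative skill".
xs = [480, 500, 520, 540]
ys = [-73.889 + 0.296 * x for x in xs]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
# Slope: covariance of x and y divided by the variance of x.
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx
print(round(b, 3), round(a, 3))  # recovers 0.296 and -73.889
```

With real, noisy data the recovered coefficients would differ from the generating ones, and the unexplained scatter is what the reported R² = .39 quantifies.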

5 Discussions and implications

This study aimed to examine the effect of digital storytelling on 10th grade students’ academic achievement, critical thinking disposition, co-regulation, and narrative skills. Throughout the collaborative digital storytelling process, students in the experimental group actively interacted with their teachers, friends, and the subject content, produced creative products that reflected their perspectives, and became the heroes of their own stories. Conducted in such an environment of high participation and motivation, the study yielded important findings that contribute to the literature.

According to the results, students in the experimental group who created digital stories were more successful at the end of the process than students in the control group. This result parallels other studies in the literature (Çiçek, 2018; Figg & McCartney, 2010; Foley, 2013; Gömleksiz & Pullu, 2017; Hung et al., 2012; Korucu, 2020; Robin, 2006; Yang & Wu, 2012). It can be explained primarily by the constructivist environment created by the collaborative digital story activities: features such as cooperation, active participation, and interaction among the students in the experimental group are believed to have positively affected the learning outcomes. Digital stories are known to be a teaching tool that supports learning, encourages cooperation, improves creativity and decision making, and enables students to participate actively in the learning process and learn from each other (Balaman-Uçar, 2016; Dogan & Robin, 2008; Robin, 2006; Smeda et al., 2014). Active interaction and communication with the content, peers, and teachers throughout the digital storytelling process is argued to provide students with more meaningful learning and to support their achievement (Figg & McCartney, 2010; Jenkins & Lonsdale, 2007). Furthermore, the fact that students who create digital stories make meaning from their own perspective and add their own interpretations to their products (Malita & Martin, 2010; Robin & McNeil, 2012; Sadik, 2008; Yuksel-Arslan et al., 2016) may have supported their achievement by helping them internalize more information.

The positive effect of digital storytelling on academic achievement can also be related to its multi-sensory nature and its capacity to embody information. Indeed, it is emphasized that the flexible and dynamic nature of digital storytelling engages many cognitive processes by activating the senses: by integrating visual images with written text, it can stimulate students’ visual and auditory senses in ways printed textbooks cannot, and it improves and accelerates student comprehension (Dreon et al., 2011; Nordmark & Milrad, 2012; Sadik, 2008). It has been stated that digital stories can be used to transform abstract information into concrete form and to make difficult concepts more understandable, because some abstract information may be difficult for students to grasp given their cognitive abilities (Ohler, 2013; Robin, 2008; Yuksel-Arslan et al., 2016). Moreover, the fact that digital storytelling facilitates the recall and retention of information, especially through its effect on memory, may be related to the increase in students’ academic success (Bromberg et al., 2013; Csikar & Stefaniak, 2018; Di-Blas et al., 2009; Sarıca & Usluel, 2016b; Wang & Zhan, 2010). Students’ active participation in the collaborative process to improve their products, and the repetition of similar processes for each product, are believed to contribute positively to performance by supporting recall and retention (Balaman-Uçar, 2016).

According to the results obtained in the study, the digital storytelling process significantly increased the students’ critical thinking disposition. This result can be explained primarily by the desire and motivation (Giancarlo & Facione, 2001) generated by the digital storytelling process. The relaxed atmosphere and lively environment created by digital storytelling is said to encourage students to interact, talk, and discuss critically with each other more than traditional methods do (Karami et al., 2012). Digital storytelling contributes significantly to the development of critical thinking, a desirable educational outcome, and to students’ critical reflection on what they have learned (Jenkins & Lonsdale, 2007; Lampert, 2007; Robin, 2016). Students critically consider multiple perspectives when researching and selecting multimedia content that meaningfully supports their stories and ideas, and when deciding what information to include to convey their message (Chan et al., 2017; Czarnecki, 2009; Kulla-Abbott, 2006). The presence of students with different abilities and views can be expected to create a diversity of ideas; in this sense, group digital storytelling can be seen as a process in which students see, accept, and respect each other’s differences. In the current study, students’ constructive criticism of their peers’ ideas and products and their feedback on group activities (Balaman-Uçar, 2016; Wang & Zhan, 2010) are also believed to have contributed to increasing their critical thinking disposition.
Although there was no significant difference between the groups in the “innovativeness” dimension of the critical thinking disposition scale, the higher mean scores of the experimental group support the view that students with high innovative tendencies try to learn new information by researching, reading, questioning, and acting selectively, driven by their curiosity (Kılıç & Şen, 2014). From an educational perspective, the stages of the digital story creation process are emphasized as being closely related to the transferable and applicable critical thinking skills of innovative individuals, such as idea formation, selection, comparison, inference, organization, and review (Boase, 2008; Jenkins & Lonsdale, 2007; Lantz et al., 2020).

Another finding was that the co-regulation skills scores were significantly higher in the experimental group that created the digital story. In this sense, some studies in the literature (Hafner & Miller, 2011 ; Ming et al., 2014 ; Ohler, 2013 ; Robin, 2006 ; Smeda et al., 2014 ; Wang & Zhan, 2010 ; Yuksel et al., 2011 ) have found that storytelling leads students to communicate more with each other, teaches them to work with the group, prepares an environment for cooperation, and encourages them to work together to achieve certain goals. It is known that due to the nature of digital storytelling, it provides more opportunities for collaborative learning activities, communication and interaction between group members at almost every stage, from brainstorming ideas to the sharing step (Balaman-Uçar, 2016 ; Nam, 2017 ; Ming et al., 2014 ). It has been reported that thanks to this interaction provided by group work, students participate more in learning processes, knowledge construction is shared among group members, they have the opportunity to work more together, their responsibility skills develop, and they help each other more thanks to the responsibility they take on (Hung et al., 2012 ; Karakoyun & Yapıcı, 2016 ; Sadik, 2008 ; Smeda et al., 2014 ; Yang & Wu, 2012 ).

The study also concluded that the narrative skills of students in the experimental group increased significantly over the weeks. Consistent with this result, various studies (Balaman-Uçar, 2016; Campbell, 2012; Dogan & Robin, 2008; Foley, 2013; Girmen et al., 2019; Yamaç & Ulusoy, 2016) have found that narrative skills improve over time during the digital storytelling process and that increasingly skilled products emerge. By requiring students to organize their thoughts consistently (Ohler, 2006; Oskoz & Elola, 2014; Sylvester & Greenidge, 2009), digital storytelling activates students' writing skills (Abdel-Hack & Helwa, 2014), improves idea development, organization, and sentence fluency, and supports fluency in story writing (Yamaç & Ulusoy, 2016). Moreover, a well-thought-out and well-written script makes the digital story more effective and successful (Dogan & Robin, 2008; Ohler, 2006; Robin, 2008). In this regard, the personal perspective and commentary in digital story narration make the story interesting, give students a chance to make their voices heard, and help them take ownership of their stories (Reinders, 2011; Robin & McNeil, 2012; Xu et al., 2011).

The regression results show that the variables in the model explain 39% of the variance in the dependent variable; the remaining 61% is attributable to factors outside the model. This may be due in part to the small sample size and can be considered a limitation of the study. In future studies, researchers could strengthen the model by adding new variables to the independent variables used here. According to the model, the relationship between narrative skills and academic achievement can be explained by students' efforts to create effective and original narratives by weaving the information they have researched into a scenario during the digital storytelling process. Storytelling skills play an important role in transferring the information learned during the teaching-learning process and transforming it into an outcome (Temizkan, 2011). Creating a story scenario is a complex skill involving continuous thinking, organizing, rethinking, and rearranging (Abdel-Hack & Helwa, 2014), and the iterative cycle of this process has been found to have a positive impact on student achievement (Balaman-Uçar, 2016). In particular, the continuous peer feedback received throughout the process (Kearney, 2009; Kulla-Abbott, 2006; Robin, 2016) allows students to see and refine their story constructions.
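The 39%/61% split above is the standard reading of a regression model's R² statistic. As a minimal sketch of how that figure is computed and interpreted (the data below are synthetic and invented purely for illustration, not the study's data):

```python
import numpy as np

# Illustrative only: toy predictor (e.g., a narrative-skill score) and
# outcome (e.g., an achievement score) with unexplained noise mixed in.
# These numbers are NOT from the study.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 0.8 * x + rng.normal(scale=1.0, size=100)

# Fit y = a*x + b by ordinary least squares.
a, b = np.polyfit(x, y, 1)
y_hat = a * x + b

# R^2 = 1 - SS_residual / SS_total: the share of the outcome's variance
# the model accounts for; the remainder is attributed to other factors.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"R^2 = {r_squared:.2f}")  # roughly 0.4 under these settings
```

An R² of 0.39 therefore means the predictors jointly account for 39% of the variability in achievement scores, with the remaining 61% left to variables not in the model.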

6 Conclusion

This study found that digital storytelling in a collaborative environment had a positive effect on high school students' academic achievement, critical thinking, co-regulation, and narrative skills. It also found that narrative skills had a significant effect on academic achievement and that these skills developed throughout the digital storytelling process. To keep students interested and engaged, two weeks were allocated to the pilot implementation and eight weeks to the main implementation. Several limitations were encountered during this process. First, because the main implementation period was eight weeks in total, the scope of the study was limited to the Cell Division unit. Second, some technical problems arose from the outdated computer infrastructure in the Information Technology classroom of the implementation school. In addition, Microsoft (MS) Photo Story 3 was chosen for creating the digital stories because of its ease of setup and use; other mobile and online applications with animation creation and video editing features could not be used, since mobile devices were not available to the whole study group and the technical infrastructure was lacking. Some suggestions based on the results of the study are as follows:

Given the significant impact of collaborative digital storytelling on academic performance, critical thinking, collaborative regulation, and narrative skills, incorporating digital story activities into the classroom may help high school students develop these skills.

Students can be encouraged to use digital stories as a tool when preparing their homework and projects.

The current study was conducted in a 10th grade biology class. An interdisciplinary study can compare data from different courses with numerical and verbal content to see which students are more successful or interested.

The consistency of digital storytelling with other topics in biology courses can be examined with more comprehensive studies that include different biology topics.

In place of MS Photo Story 3, other desktop, mobile, and online tools can be used, depending on the educational level of the sample and their ability to use information technology.

Studies can be conducted to determine the difference between the processes of creating digital stories individually and in groups.

Abdel-Hack, E. M., & Helwa, H. S. A. A. (2014). Using digital storytelling and weblogs instruction to enhance EFL narrative writing and critical thinking skills among EFL majors at faculty of education. Educational Research , 5 (1), 8–41. https://doi.org/10.14303/er.2014.011 .


Balaman-Uçar, S. (2016). Dijital öykülemenin İngilizceyi yabancı dil olarak öğrenen öğrencilerin yazma becerilerine olan etkisi (Tez No. 435230) [Doktora Tezi, Hacettepe Üniversitesi, Ankara]. Yükseköğretim Kurulu Ulusal Tez Merkezi.

Barber, J. F. (2016). Digital storytelling: New opportunities for humanities scholarship and pedagogy. Cogent Arts & Humanities , 3 (1), 1181037. https://doi.org/10.1080/23311983.2016.1181037 .

Bedir-Erişti, S. D. (2016). Participatory design-based digital storytelling and creativity indicators in elementary school. Turkish Online Journal of Qualitative Inquiry , 7 (4), 462–492. https://doi.org/10.17569/tojqi.28031 .

Boase, C. (2008). Digital storytelling for reflection and engagement: A study of the uses and potential of digital storytelling. Report produced as part of Phase 1 of the Higher Education Academy/JISC Higher Education e-Learning Pathfinder Programme. https://gjamissen.files.wordpress.com/2013/05/boase_assessment.pdf

Bromberg, N. R., Techatassanasoontorn, A. A., & Andrade, A. D. (2013). Engaging students: Digital storytelling in information systems learning. Pacific Asia Journal of the Association for Information Systems , 5 (1), 2. https://doi.org/10.17705/1pais.05101 .

Bull, G., & Kajder, S. (2004). Digital storytelling in the language arts classroom. Learning & Leading with Technology , 32 (4), 46–49.


Bumgarner, B. L. (2012). Digital storytelling in writing: A case study of student-teacher attitudes toward teaching with technology (Publication No. 3533990) [Doctoral dissertation, University of Missouri-Columbia]. ProQuest Dissertations and Theses Global.

Büyüköztürk, Ş., Kılıç-Çakmak, E., Akgün, Ö. E., Karadeniz, Ş., & Demirel, F. (2013). Bilimsel araştırma yöntemleri (15. Baskı). Pegem Yayıncılık.

Campbell, T. A. (2012). Digital storytelling in an elementary classroom: Going beyond entertainment. Procedia-Social and Behavioral Sciences, 69 (2012), 385–393. https://doi.org/10.1016/j.sbspro.2012.11.424 .

Çetin, E. (2021). Digital storytelling in teacher education and its effect on the digital literacy of pre-service teachers. Thinking Skills and Creativity , 39 , 100760. https://doi.org/10.1016/j.tsc.2020.100760 .

Chan, B. S., Churchill, D., & Chiu, T. K. (2017). Digital literacy learning in higher education through digital storytelling approach. Journal of International Education Research , 13 (1), 1–16. https://doi.org/10.19030/jier.v13i1.9907 .

Çiçek, M. (2018). Investigating the effects of digital storytelling use in sixth-grade science course: A mixed-method research study (Publication No. 507290) [Doctoral dissertation, Middle East Technical University-Ankara]. Yükseköğretim Kurulu Ulusal Tez Merkezi.

Condy, J., Chigona, A., Gachago, D., Ivala, E., & Chigona, A. (2012). Pre-service students’ perceptions and experiences of digital storytelling in diverse classrooms. Turkish Online Journal of Educational Technology-TOJET , 11 (3), 278–285.

Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.) [DX Reader version]. http://basu.nahad.ir/ .

Csikar, E., & Stefaniak, J. E. (2018). The utility of storytelling strategies in the biology classroom. Contemporary Educational Technology , 9 (1), 42–60.

Czarnecki, K. (2009). Storytelling in context. Library Technology Reports , 45 (7), 5.

Dewi, N. R., Kannapiran, S., & Wibowo, S. W. A. (2018). Development of digital storytelling-based science teaching materials to improve students’ metacognitive ability. Jurnal Pendidikan IPA Indonesia , 7 (1), 16–24. https://doi.org/10.15294/jpii.v7i1.12718 .

Di-Blas, N., Garzotto, F., Paolini, P., & Sabiescu, A. (2009, December). Digital storytelling as a whole-class learning activity: Lessons from a three-years project. In Joint International Conference on Interactive Digital Storytelling (pp. 14–25).

DiDonato, N. C. (2013). Effective self-and co-regulation in collaborative learning groups: An analysis of how students regulate problem solving of authentic interdisciplinary tasks. Instructional Science , 41 (1), 25–47. https://doi.org/10.1007/s11251-012-9206-9 .

Dogan, B., & Robin, B. (2008, March). Implementation of digital storytelling in the classroom by teachers trained in a digital storytelling workshop. In Society for Information Technology & Teacher Education International Conference (pp. 902–907).

Dogan, B., & Robin, B. (2009, March). Educational uses of digital storytelling: Creating digital storytelling contests for K-12 students and teachers. In Society for Information Technology & Teacher Education International Conference (pp. 633–638).

Dreon, O., Kerper, R. M., & Landis, J. (2011). Digital storytelling: A tool for teaching and learning in the YouTube generation. Middle School Journal , 42 (5), 4–10. https://doi.org/10.1080/00940771.2011.11461777 .

Erdem, A. R., & Genç, G. (2015). Lise öğrencilerinin problem çözme becerileri ile eleştirel düşünme becerileri arasındaki ilişki. OPUS Uluslararası Toplum Araştırmaları Dergisi , 5 (8), 32–44.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations . American Philosophical Association , 3. https://files.eric.ed.gov/fulltext/ED315423.pdf .

Facione, P. A. (2011). Critical thinking: What it is and why it counts. Insight Assessment , 2007 (1), 1–23.

Figg, C., & McCartney, R. (2010). Impacting academic achievement with student learners teaching digital storytelling to others: The ATTTCSE digital video project. Contemporary Issues in Technology and Teacher Education , 10 (1), 38–79.

Foley, L. M. (2013). Digital storytelling in primary-grade classrooms (Publication No. 3560250) [Doctoral dissertation, Arizona State University-Tucson]. ProQuest Dissertations & Theses Global.

France, D., & Wakefield, K. (2011). How to produce a digital story. Journal of Geography in Higher Education , 35 (4), 617–623. https://doi.org/10.1080/03098265.2011.560658 .

Frisch, J. K., & Saunders, G. (2008). Using stories in an introductory college biology course. Journal of Biological Education , 42 (4), 164–169. https://doi.org/10.1080/00219266.2008.9656135 .

Fu, J. S., Yang, S. H., & Yeh, H. C. (2021). Exploring the impacts of digital storytelling on English as a foreign language learners’ speaking competence. Journal of Research on Technology in Education , 1–16. https://doi.org/10.1080/15391523.2021.1911008 .

Giancarlo, C. A., & Facione, P. A. (2001). A look across four years at the disposition toward critical thinking among undergraduate students. The Journal of General Education , 29–55.

Girmen, P., Özkanal, Ü., & Dayan, G. (2019). Digital storytelling in the language arts classroom. Universal Journal of Educational Research , 7 (1), 55–65. https://doi.org/10.13189/ujer.2019.070108 .

Gömleksiz, M. N., & Pullu, E. K. (2017). Toondoo Ile dijital hikâyeler oluşturmanın öğrenci başarısına ve tutumlarına Etkisi. Electronic Turkish Studies , 12 (32), 95–110. https://doi.org/10.7827/TurkishStudies.12717 .

Hadwin, A. F., & Oshige, M. (2011). Self-regulation, coregulation, and socially shared regulation: Exploring perspectives of social in self-regulated learning theory. Teachers College Record , 113 (2), 240–264.

Hafner, C. A., & Miller, L. (2011). Fostering learner autonomy in English for science: A collaborative digital video project in a technological learning environment. Language Learning & Technology , 15 (3), 68–86.

Hung, C. M., Hwang, G. J., & Huang, I. (2012). A project-based digital storytelling approach for improving students’ learning motivation, problem-solving competence, and learning achievement. Journal of Educational Technology & Society , 15 (4), 368–379.

Ivala, E., Gachago, D., Condy, J., & Chigona, A. (2013). Enhancing student engagement with their studies: A digital storytelling approach. https://doi.org/10.4236/ce.2013.410A012 .

Jenkins, M., & Lonsdale, J. (2007). Evaluating the effectiveness of digital storytelling for student reflection. In ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore 2007 (pp. 440–444).

Karadağ, Ö., & Maden, S. (2013). Yazma eğitimi: kuram, uygulama, ölçme ve değerlendirme. Türkçe öğretimi el kitabı (1. Baskı) (ss. 265–301). Pegem Akademi Yayınları.

Karakoyun, F., & Yapıcı, I. Ü. (2016). Use of Digital Storytelling in Biology Teaching. Universal Journal of Educational Research , 4 (4), 895–903. https://doi.org/10.13189/ujer.2016.040427 .

Karami, M., Pakmehr, H., & Aghili, A. (2012). Another view to importance of teaching methods in curriculum: Collaborative learning and students’ critical thinking disposition. Procedia-Social and Behavioral Sciences , 46 , 3266–3270. https://doi.org/10.1016/j.sbspro.2012.06.048 .

Kearney, M. (2009). Towards a learning design for student-generated digital storytelling. The Future of Learning Design Conference . 4.

Kim, D., & Li, M. (2021). Digital storytelling: Facilitating learning and identity development. Journal of Computers in Education , 8 (1), 33–61. https://doi.org/10.1007/s40692-020-00170-9 .

Kılıç, H. E., & Şen, A. İ. (2014). UF/EMI eleştirel düşünme eğilimi ölçeğini Türkçeye Uyarlama çalışması. Eğitim Ve Bilim , 39 (176), 1–12. https://doi.org/10.15390/EB.2014.3632 .

Korucu, A. T. (2020). Fen eğitiminde kullanılan dijital hikâyelerin öğretmen adaylarının akademik başarısı, sayısal yetkinlik durumları ve sorgulama becerileri üzerindeki etkisi. Kastamonu Eğitim Dergisi , 28 (1), 352–370. https://doi.org/10.24106/kefdergi.3617 .

Kulla-Abbott, T. M. (2006). Developing literacy practices through digital storytelling (Publication No. 3285640) [Doctoral dissertation, University of Missouri-St. Louis]. ProQuest Dissertations and Theses Global.

Laal, M., & Ghodsi, S. M. (2012). Benefits of collaborative learning. Procedia-social and Behavioral Sciences , 31 , 486–490. https://doi.org/10.1016/j.sbspro.2011.12.091 .

Lampert, N. (2007). Critical thinking dispositions as an outcome of undergraduate education. The Journal of General Education , 17–33.

Lantz, J. L., Myers, J., & Wilson, R. (2020). Digital storytelling and young children: Transforming learning through creative use of technology. In handbook of research on integrating digital technology with literacy pedagogies (pp. 212–231). IGI Global. https://doi.org/10.4018/978-1-7998-0246-4.ch010 .

Malita, L., & Martin, C. (2010). Digital storytelling as web passport to success in the 21st century. Procedia-Social and Behavioral Sciences , 2 (2), 3060–3064. https://doi.org/10.1016/j.sbspro.2010.03.465 .

McMillan, J. H., & Schumacher, S. (2010). Research in education: Evidence-based inquiry (7th Edition), MyEducationLab Series. Pearson.

Ming, T. S., Sim, L. Y., Mahmud, N., Kee, L. L., Zabidi, N. A., & Ismail, K. (2014). Enhancing 21st-century learning skills via digital storytelling: Voices of Malaysian teachers and undergraduates. Procedia-Social and Behavioral Sciences , 118 (1), 489–494. https://doi.org/10.1016/j.sbspro.2014.02.067 .

Ministry of Education. (2018). Ministry of National Education curriculum . https://mufredat.meb.gov.tr/Programlar.aspx

Nam, C. W. (2017). The effects of digital storytelling on student achievement, social presence, and attitude in online collaborative learning environments. Interactive Learning Environments , 25 (3), 412–427. https://doi.org/10.1080/10494820.2015.1135173 .

Nordmark, S., & Milrad, M. (2012, March). Mobile digital storytelling for promoting creative collaborative learning. In 2012 IEEE Seventh International Conference on Wireless, Mobile and Ubiquitous Technology in Education (pp. 9–16). IEEE. https://doi.org/10.1109/WMUTE.2012.10 .

Ohler, J. (2006). The world of digital storytelling. Educational Leadership , 63 (4), 44–47.

Ohler, J. B. (2013). Digital storytelling in the classroom: New media pathways to literacy,learning, and creativity (2nd ed.). Corwin.

Oskoz, A., & Elola, I. (2014). Integrating digital stories in the writing class: Towards a 21st-century literacy. Digital Literacies in Foreign Language Education: Research, Perspectives and Best Practices , 179–200.

Özbay, M., & Barutçu, T. (2013). Dil psikolojisi ve Türkçe öğretimi. Adıyaman Üniversitesi Sosyal Bilimler Enstitüsü Dergisi , 6 (11), 933–973.

Pan, V., & Tanrıseven, I. (2016). Öğretmen adaylarının işbirlikli-düzenleme durumlarının çeşitli değişkenler açısından incelenmesi. Mersin Üniversitesi Eğitim Fakültesi Dergisi , 12 (1). https://doi.org/10.17860/efd.86508 .

Reinders, H. (2011). Digital storytelling in the foreign language classroom. ELT World Online Blog , 26 , 1–9.

Robin, B. R. (2008). Digital storytelling: A powerful technology tool for the 21st-century classroom. Theory into Practice , 47 (3), 220–228. https://doi.org/10.1080/00405840802153916 .

Robin, B. R. (2016). The power of digital storytelling to support teaching and learning. Digital Education Review , 30 , 17–29.

Robin, B. (2006, March). The educational uses of digital storytelling. In Society for Information Technology & Teacher Education International Conference (pp. 709–716).

Robin, B. R., & McNeil, S. G. (2012). What educators should know about teaching digital storytelling. Digital Education Review , 37–51.

Sadik, A. (2008). Digital storytelling: A meaningful technology-integrated approach for engaged student learning. Educational Technology Research and Development , 56 (4), 487–506. https://doi.org/10.1007/s11423-008-9091-8 .

Sarıca, H. Ç., & Usluel, Y. K. (2016a). The effect of digital storytelling on visual memory and writing skills. Computers & Education , 94 , 298–309. https://doi.org/10.1016/j.compedu.2015.11.016 .

Sarıca, H. Ç., & Usluel, Y. K. (2016b). Eğitsel bağlamda dijital hikâye anlatımı: Bir rubrik geliştirme çalışması. Eğitim Teknolojisi Kuram Ve Uygulama , 6 (2), 65–84.

Seçer, İ. (2015). SPSS ve LISREL ile pratik veri analizi (2. baskı). Anı Yayıncılık.

Seferoğlu, S. S. (2015). Okullarda teknoloji kullanımı ve uygulamalar: Gözlemler, sorunlar ve çözüm önerileri. Artı Eğitim , 123 , 90–91.

Smeda, N., Dakich, E., & Sharda, N. (2014). The effectiveness of digital storytelling in the classrooms: A comprehensive study. Smart Learning Environments , 1 (1), 1–21. https://doi.org/10.1186/s40561-014-0006-3 .

Sylvester, R., & Greenidge, W. L. (2009). Digital storytelling: Extending the potential for struggling writers. The Reading Teacher , 63 (4), 284–295. https://doi.org/10.1598/RT.63.4.3 .

Tabachnick, B. G., Fidell, L. S., & Ullman, J. B. (2007). Using multivariate statistics (Vol. 5, pp. 481–498). [DX Reader version]. https://www.pearsonhighered.com/ .

Tanrıkulu, F. (2021). Students’ perceptions about the effects of collaborative digital storytelling on writing skills. Computer Assisted Language Learning , 1–16. https://doi.org/10.1080/09588221.2020.1774611 .

Temizkan, M. (2011). Yaratıcı Yazma etkinliklerinin öykü yazma becerisi üzerindeki etkisi. Kuram ve Uygulamada Eğitim Bilimleri Dergisi , 11 (2), 919–940.

Tezci, E., & Perkmen, S. (2016). Oluşturmacı perspektiften teknolojinin öğrenme-öğretme sürecine entegrasyonu. K. Çağıltay, Y. Göktaş (Ed.). Öğretim teknolojilerinin temelleri: teoriler, araştırmalar, eğilimler içinde (2.baskı, ss 193–218). PegemA Yayıncılık.

Toki, E. I., & Pange, J. (2014). ICT use in early childhood education: Storytelling. Tiltai , 66 (1), 183–192.

Turgut, M. F., & Baykul, Y. (2015). Eğitimde Ölçme ve Değerlendirme (7. Baskı). Pegem Akademi.

Wang, S., & Zhan, H. (2010). Enhancing teaching and learning with digital storytelling. International Journal of Information and Communication Technology Education (IJICTE) , 6 (2), 76–87.

Xu, Y., Park, H., & Baek, Y. (2011). A new approach toward digital storytelling: An activity focused on writing self-efficacy in a virtual learning environment. Journal of Educational Technology & Society , 14 (4), 181–191.

Yamaç, A., & Ulusoy, M. (2016). The effect of digital storytelling in improving the third graders’ writing skills. International Electronic Journal of Elementary Education , 9 (1), 59–86.

Yang, Y. T. C., & Wu, W. C. I. (2012). Digital storytelling for enhancing student academic achievement, critical thinking, and learning motivation: A year-long experimental study. Computers & Education , 59 (2), 339–352. https://doi.org/10.1016/j.compedu.2011.12.012 .

Yuksel, P., Robin, B., & McNeil, S. (2011, March). Educational uses of digital storytelling all around the world. In Society for Information Technology & Teacher Education International Conference (pp. 1264–1271). Association for the Advancement of Computing in Education (AACE).

Yuksel-Arslan, P., Yildirim, S., & Robin, B. R. (2016). A phenomenological study: Teachers’ experiences of using digital storytelling in early childhood education. Educational Studies , 42 (5), 427–445. https://doi.org/10.1080/03055698.2016.1195717 .


Acknowledgements

This study was carried out as part of the doctoral thesis entitled “The Effects of Digital Storytelling on High School Students’ Academic Achievements, Critical Thinking Dispositions, Co-Regulations and Narrative Skills” ( Thesis Number: 679745).

Open access funding provided by the Scientific and Technological Research Council of Türkiye (TÜBİTAK).

Author information

Authors and Affiliations

Ministry of National Education, Van, Turkey

Sinan Bilici

Department of Software Engineering, Engineering Faculty, Atatürk University, Erzurum, 25240, Turkey

Rabia Meryem Yilmaz

Department of Computer Education and Instructional Technology, Atatürk University, Erzurum, 25240, Turkey


Corresponding author

Correspondence to Rabia Meryem Yilmaz .

Ethics declarations

Ethical approval.

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Bilici, S., Yilmaz, R.M. The effects of using collaborative digital storytelling on academic achievement and skill development in biology education. Educ Inf Technol (2024). https://doi.org/10.1007/s10639-024-12638-7


Received : 30 May 2023

Accepted : 21 March 2024

Published : 17 April 2024

DOI : https://doi.org/10.1007/s10639-024-12638-7


Keywords

  • Digital storytelling
  • Biology education
  • Academic achievement
  • Critical thinking disposition
  • Co-regulation skill
  • Narrative skill


  22. CT 1

    EXTERNAL CONSISTENCY relates to what you and your audience agree is true of the real world. Unlike internal consistency, external consistency is a relative notion, since it can and does vary from audience to audience. 3) This is one of the major reasons why the selection of audience is so critical in any communication situation.

  23. Structured Thinking: Bringing Consistency to Problem Management

    Using a consistent approach focused on critical data coupled with a visual representation of this thinking tremendously improves communication and collaboration. Besides significantly advancing the search for true root cause. Here is an example of information that was recorded in a typical, freeform field:

  24. 12 Examples of Consistency

    Here are 12 examples of consistency: 1. Coming into work every day on time. An excellent example of consistency is someone who comes to work every day on time and is ready to start. The person sends a message to their boss and colleagues that they are dependable and can be counted on to get the job done.

  25. Critical Thinking: Definition and Analysis

    Essentially, critical thinking acts as intellectual armor, equipping individuals with the tools to navigate the complex landscape of information and ideas in today's world. One of the foundational elements of critical thinking is analysis. Analysis entails breaking down complex ideas or issues into their constituent parts, closely examining ...

  26. What is the Decision-Making Process? Definition, Steps, Examples, and

    Ethical Decision-Making Process. Ethical decision-making involves considering moral principles, values, and standards when making choices. Here's a structured approach to ethical decision-making: 1. Identify the Ethical Issue: Recognize that there is an ethical dilemma or decision to be made.

  27. The effects of using collaborative digital storytelling on ...

    The purpose of the study is to investigate the effect of the use of digital storytelling on academic achievement, critical thinking dispositions, co-regulation, and narrative skills of 10th grade students. To this end, the study was conducted using a semi-experimental design with a convenience sample. The participants consisted of 64 students (33 in experimental and 31 in control group) who ...

  28. Consistency®: How to Feel More Energized at Work

    Below are audio and video plus a transcript of the conversation, including time stamps. Productive employees want energy, motivation and drive to characterize their work life. Managers want their ...

  29. The Newest Vital Sign

    A Health Literacy Assessment Tool for Patient Care and Research The Newest Vital Sign (NVS) is a valid and reliable screening tool available in English and Spanish that identifies patients at risk for low health literacy. It is easy and quick to administer, requiring just three minutes. In clinical settings, the test allows providers to appropriately adapt their communication practices to the ...