The Classroom | Empowering Students in Their College Journey

The Relationship Between Scientific Method & Critical Thinking

Scott Neuffer

What Is the Function of the Hypothesis?

Critical thinking, the mind’s ability to analyze claims about the world, is the intellectual basis of the scientific method. The scientific method can be viewed as an extended, structured mode of critical thinking that moves through hypothesis, experimentation and conclusion.

Critical Thinking

Broadly speaking, critical thinking is any analytical thought aimed at determining the validity of a specific claim. It can be as simple as a nine-year-old questioning a parent’s claim that Santa Claus exists, or as complex as physicists questioning the relativity of space and time. Critical thinking is the point when the mind turns in opposition to an accepted truth and begins analyzing its underlying premises. As American philosopher John Dewey said, it is the “active, persistent and careful consideration of a belief or supposed form of knowledge in light of the grounds that support it, and the further conclusions to which it tends.”

Critical thinking initiates the act of hypothesis. In the scientific method, the hypothesis is the initial supposition, or theoretical claim about the world, based on questions and observations. If critical thinking asks the question, then the hypothesis is the best attempt at the time to answer it using observable phenomena. For example, an astrophysicist may question existing theories of black holes based on his own observations. He may posit a contrary hypothesis, arguing that black holes actually produce white light. It is not a final conclusion, however, as the scientific method requires specific forms of verification.

Experimentation

The scientific method uses formal experimentation to test any hypothesis. The rigorous and specific methodology of experimentation is designed to gather unbiased empirical evidence that either supports or contradicts a given claim. Control groups and controlled variables provide an objective basis for comparison. For example, researchers studying the effects of a certain drug may give half the test population a placebo pill and the other half the real drug. The effects of the real drug can then be assessed relative to the control group.
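The control-group comparison described above can be sketched in code. The following is a minimal, hypothetical simulation (the function name, group sizes, and effect numbers are all invented for illustration): it generates symptom-improvement scores for a placebo group and a drug group, then estimates the drug's effect as the difference between the group means.

```python
import random
import statistics

def simulate_trial(n=500, drug_effect=2.0, seed=1):
    """Hypothetical drug trial: return the estimated drug effect.

    Each participant gets a symptom-improvement score. Both groups share
    the same baseline placebo response; only the drug group also gets the
    drug effect. All numbers here are illustrative, not real trial data.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    placebo = [rng.gauss(0.5, 1.0) for _ in range(n)]             # control group
    drug = [rng.gauss(0.5 + drug_effect, 1.0) for _ in range(n)]  # treatment group
    # The shared baseline cancels out in the comparison, isolating the drug effect.
    return statistics.mean(drug) - statistics.mean(placebo)
```

Because both groups experience the same baseline response, subtracting the control mean isolates the drug's contribution, which is exactly the role the control group plays in the paragraph above.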

In the scientific method, conclusions are drawn only after tested, verifiable evidence supports them. Even then, conclusions are subject to peer review and often retested before general consensus is reached. Thus, what begins as an act of critical thinking becomes, in the scientific method, a complex process of testing the validity of a claim. English philosopher Francis Bacon put it this way: “If a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts, he shall end in certainties.”


  • John Dewey, How We Think
  • Francis Bacon, The Advancement of Learning

Scott Neuffer is an award-winning journalist and writer who lives in Nevada. He holds a bachelor's degree in English and spent five years as an education and business reporter for Sierra Nevada Media Group. His first collection of short stories, "Scars of the New Order," was published in 2014.


What’s the Difference Between Critical Thinking and Scientific Thinking?

Thinking deeply about things is a defining feature of what it means to be human, but, surprising as it may seem, there isn’t just one way to ‘think’ about something; instead, humans have been developing organized and varied schools of thought for thousands of years.

Discussions about morality, religion, and the meaning of life often drive knowledge-seeking inquiry, leading people to wonder what the difference is between critical thinking and scientific thinking.

Critical thinkers prioritize objectivity to analyze a problem, deduce logical solutions, and examine the ramifications of those solutions.

While scientific thinking often relies heavily on critical thinking, scientific inquiry is dedicated to acquiring knowledge rather than to mere abstraction.

There are a lot of nuances between critical thinking and scientific thinking, and most of us probably utilize these skills in our everyday lives. The rest of this article will thoroughly define the two terms and relate how they are similar and different.

What Is Critical Thinking?

Critical thinking is a mindset ― a lens, if you will, through which one may view the world. Critical thinkers rely on a lot of introspection, constantly self-evaluating how they came to a conclusion, and what that conclusion naturally entails.

A critical thinker may discern what they already know about a subject, what that information suggests, why that information is relevant, and how that information could be linked to further lines of inquiry. Critical thinking is, therefore, simply the ability to think clearly and logically.

Systematic reasoning is prized over gut instinct, and determining relevance is crucial to parsing out useful data from extraneous information.

Naturally, the ability to think critically is highly prized in an academic setting, and most educators seek to enable their students to think critically.

Classroom prompts often take this form: What is the link between the styles and motivations of these two Romantic-era poets? How can your current understanding of algebra be applied to geometry? How does our understanding of this historical figure influence our understanding of social life at the time?

So much information can be interlinked to develop our understanding of the world. Critical thinking is the basis for using objectivity not only to establish the likely outcomes of a scenario, but also to inquire into the repercussions of those outcomes and to reflect on the process by which one reached a conclusion.

What Is Scientific Thinking?

The objective of scientific thinking is the acquisition of knowledge. The more we know, the more we can hope to know.

Scientific thinking begins by imagining what the outcome of a problem may be, observing the situation, and then taking notes and revising the initial hypothesis accordingly.

The commonly used scientific method is as follows:

  • Define the purpose of the experiment
  • Formulate a hypothesis
  • Study the phenomenon and collect data
  • Draw conclusions from the results
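
The four-step cycle above can be sketched as a toy loop in Python. Everything here is my own illustration, not an established algorithm: the "phenomenon" is a simulated biased coin, and each pass formulates a hypothesis about its heads rate, collects data, and revises the hypothesis until the evidence supports it.

```python
import random

def collect_data(trials, true_rate=0.7, seed=0):
    """Observe the phenomenon: count heads from a simulated biased coin."""
    rng = random.Random(seed)  # seeded so each experiment is reproducible
    return sum(rng.random() < true_rate for _ in range(trials))

def scientific_method(initial_hypothesis, rounds=5, trials=10_000):
    """Hypothesize -> experiment -> compare -> revise, until data agree."""
    hypothesis = initial_hypothesis
    for round_no in range(rounds):
        heads = collect_data(trials, seed=round_no)  # run the experiment
        observed = heads / trials                    # draw results
        if abs(observed - hypothesis) < 0.02:        # evidence supports the claim
            return hypothesis, round_no
        hypothesis = observed                        # revise and repeat
    return hypothesis, rounds
```

Starting from a naive 50/50 hypothesis, the loop converges on a rate near the true 0.7 within a couple of rounds, and each accepted conclusion could in turn seed a new experiment, just as the repetition described below suggests.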

As you might imagine, this process can be repeated ad infinitum. So, you draw a conclusion that’s scientifically verifiable? Great! Now you can take that conclusion and use it as a basis for a new experiment. Of course, the scientific method has limits.

It’s hard to apply the scientific method when it comes to morality or religious beliefs. A revelation of a prophet cannot be empirically verified.

We can’t go inside said prophet’s mind and see exactly which neurons were firing to recreate the conditions under which the vision occurred, and even if we could, the nature of such a revelation is spiritual and immaterial.

Because we cannot manipulate the supernatural in the material world, creating a test that relies on changing something to see the outcome is impossible. Where scientific thinking does excel is in the fields of math and, well, science.

Physics is sometimes called the most exact science because the forces that shape our world are well understood and rarely exhibit anomalies, making the empirically grounded scientific method well suited to improving our understanding of the natural world.

How Are Critical Thinking and Scientific Thinking Similar and Different?

Both critical and scientific thinking rely on the use of empirical, objective evidence. Thinking scientifically or critically relies on using the data available and following it to its likely conclusion.

Scientific thinking can be seen as a stricter, more regulated version of critical thinking. It takes the tenets of critical thinking and narrows the focus.

Both disciplines eschew personal bias and gut instinct as unreliable and unhelpful.

The main difference between the two, however, is the goal of each discipline.

While both prioritize learning and using data to make hypotheses, critical thinking is prone to much more abstraction and self-reflection.

With so little variation in the scientific method, there’s not much need to reflect on how conclusions were drawn or whether they result from any kind of bias. It’s simply not useful information.

For a critical thinker, however, self-reflection is key to identifying inconsistencies and refining one’s argument.

Both scientific thinking and critical thinking tend to draw links between concepts, evaluating how they are related and what knowledge may be gleaned from that connection.

While critical thinking can be applied to most concepts, even those of morality and anthropology, scientific thinking is often problem oriented. If a problem exists, scientific inquiry attempts to gain the necessary information to solve it, overcoming obstacles along the way.

Both critical thinkers and scientific thinkers may very well end up at the same conclusion ― they will just reach it differently. Critical thinkers are concerned with logic, order, and rational thinking.

Establishing already-understood information, applying that information to a query, and then mounting a defensible argument about the accuracy and relevance of the conclusion is the trademark of a critical thinker. Scientific thinkers, on the other hand, work toward solving problems almost exclusively through the acquisition of knowledge via the scientific method.

Scientific thinkers develop a hypothesis, test it, and then rinse and repeat until the phenomenon is understood. As such, scientific thinkers are preoccupied with “why” questions. Why does this phenomenon happen?

Why does matter behave like this? In the end, both schools of thought have a lot of interesting ideas guiding them, and most of us probably use them throughout our daily lives.




Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

1. History

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). 
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 
Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing: One notices something in one’s immediate environment (sudden cooling of temperature in Weather, bubbles forming outside a glass and then going inside in Bubbles, a moving blur in the distance in Blur, a rash in Rash). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder, no suction without air pressure in Suction pump).
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in spacing in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness: One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry: Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence: Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage: Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness: A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment: Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason: Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth: If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities: Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities: The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities: A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities: Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities: The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452).
Items testing inferential abilities constitute two of the five subtests of the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

Experimenting abilities: Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities: Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities: The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
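The difference between necessary and sufficient conditions mentioned above can be stated extensionally and checked mechanically. The following sketch is purely illustrative (the cases are hypothetical, not drawn from this article): it treats a condition C as sufficient for an effect E when no case has C without E, and as necessary for E when no case has E without C.

```python
# Hypothetical cases: does condition C hold, and does effect E hold?
cases = [
    {"C": True,  "E": True},
    {"C": False, "E": True},   # E without C, so C is not necessary for E
    {"C": False, "E": False},
    {"C": True,  "E": True},
]

def sufficient(cases):
    """C is sufficient for E: every case with C also has E."""
    return all(case["E"] for case in cases if case["C"])

def necessary(cases):
    """C is necessary for E: every case with E also has C."""
    return all(case["C"] for case in cases if case["E"])

print(sufficient(cases))  # True: no case has C without E
print(necessary(cases))   # False: the second case has E without C
```

Confusing the two directions of the conditional is precisely the error (affirming the consequent, denying the antecedent) that an understanding of this distinction guards against.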

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.
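The caveat that a difference "might have arisen by chance" invokes the concept of statistical significance discussed earlier. One minimal way to make the idea concrete, using made-up scores rather than data from Abrami et al. (2015), is a permutation test: it estimates how often random assignment of the same students to two groups would produce a difference in group means at least as large as the one actually observed.

```python
import random

random.seed(0)  # make the simulation reproducible

# Made-up test scores for two groups of ten students each
# (illustrative only -- not data from Abrami et al. 2015).
combined = [72, 75, 78, 80, 81, 83, 85, 86, 88, 90,
            70, 74, 77, 79, 82, 84, 85, 87, 89, 93]
group_a, group_b = combined[:10], combined[10:]

observed = sum(group_a) / 10 - sum(group_b) / 10  # difference of group means

# Permutation test: shuffle the group labels many times and count how
# often the relabelled difference is at least as large as the observed one.
trials, extreme = 10_000, 0
for _ in range(trials):
    random.shuffle(combined)
    diff = sum(combined[:10]) / 10 - sum(combined[10:]) / 10
    if abs(diff) >= abs(observed):
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed:.1f}, p-value: {p_value:.3f}")
```

With these made-up scores the two group means are nearly identical, so the p-value comes out large: the observed difference is just the sort that chance assignment alone routinely produces, which is what it means for a difference not to be statistically significant.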

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History.

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research, 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations, Volume I of Adventure in American Education, New York and London: Harper & Brothers. [Aikin 1942 available online]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory, 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education, 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education, Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic, 9(1): 23–30. [Bailin 1987 available online]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity, Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory, 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09, CD-ROM (pp. 1–10), Windsor, ON: OSSA. [Bailin & Battersby 2009 available online]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi, 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking, Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment, Leiden: Brill, pp. 31–46. doi:10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies, 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies, 31(3): 285–302. doi:10.1080/002202799183133
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence, 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking, London: Continuum International Publishing Group.
  • Blair, J. Anthony, 2021, Studies in Critical Thinking, Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain, New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life, Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking, Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test, Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment, Leiden: Brill, pp. 150–167. doi:10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences, Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do, New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story, Volume V of Adventure in American Education, New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide. Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic, 37(4): 351–369. [Dalgleish et al. available online]
  • Dewey, John, 1910, How We Think, Boston: D.C. Heath. [Dewey 1910 available online]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education, New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process, Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry, New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century, Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions, Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery, Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez, Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol, Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research, 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review, 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy, 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society, Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic, 6(1): 3–9. [Ennis 1984 available online]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice, New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher, 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy, 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic, 18(2–3): 165–182. [Ennis 1996 available online]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy, 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines, 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines, 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016, Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi, 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z, Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual, Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich, Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds.), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp. 195–237.
  • Stanovich, Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763
  • Association for Informal Logic and Critical Thinking (AILACT)
  • Critical Thinking Across the European Higher Education Curricula (CRITHINKEDU)
  • Critical Thinking Definition, Instruction, and Assessment: A Rigorous Approach
  • Critical Thinking Research (RAIL)
  • Foundation for Critical Thinking
  • Insight Assessment
  • Partnership for 21st Century Learning (P21)
  • The Critical Thinking Consortium
  • The Nature of Critical Thinking: An Outline of Critical Thinking Dispositions and Abilities , by Robert H. Ennis

abilities | bias, implicit | children, philosophy for | civic education | decision-making capacity | Dewey, John | dispositions | education, philosophy of | epistemology: virtue | logic: informal

Copyright © 2022 by David Hitchcock < hitchckd @ mcmaster . ca >


Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

  • Jason E. Dowd
  • Robert J. Thompson
  • Leslie A. Schiff
  • Julie A. Reynolds

*Address correspondence to: Jason E. Dowd (E-mail Address: [email protected]).

Department of Biology, Duke University, Durham, NC 27708


Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Developing critical-thinking and scientific reasoning skills is a core learning objective of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference, while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning (National Research Council, 2000), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge (Gerdeman et al., 2007) and communication skills (Clase et al., 2010), but also scientific reasoning (Reynolds et al., 2012) and critical-thinking skills (Quitadamo and Kurtz, 2007).

Critical thinking and scientific reasoning are similar but different constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct (Holyoak and Morrison, 2005), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific (Beyer et al., 2013). Some researchers, such as Lawson (2010), present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, few coherent frameworks and little empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe (American Association for the Advancement of Science, 2011). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” (Woodin et al., 2010, pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible (Reynolds et al., 2012) and learning how to “think like” and “write like” disciplinary experts (Meizlish et al., 2013). As a result, student writing affords opportunities both to foster and to examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” (Facione, 1990, p. 3). Although various other definitions of critical thinking have been proposed, researchers have generally adopted this consensus expert view (Blattner and Frazier, 2002; Condon and Kelly-Riley, 2004; Bissell and Lemons, 2006; Quitadamo and Kurtz, 2007) and the corresponding measures of critical-thinking skills (August, 2016; Stephenson and Sadler-McKnight, 2016).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education (Quitadamo and Kurtz, 2007). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” (Quitadamo and Kurtz, 2007, p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction (Hand and Keys, 1999), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking (Stephenson and Sadler-McKnight, 2016). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework (Pukkila, 2004) and developing explicit examples of how critical thinking relates to the scientific method (Miri et al., 2007).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though a thesis is required to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two did not consent to participate in the study; additionally, five did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid and invalid CCTST scores. Therefore, for most of this study, we focus on the 65 students who consented to participate and for whom we have complete and valid data. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on those 52 students.
We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
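The qualitative bands described above map directly onto CCTST score ranges. As an illustrative sketch (the `cctst_band` helper name is ours, not part of the CCTST instrument), the mapping can be written as:

```python
def cctst_band(score):
    """Map a 0-100 CCTST score to its qualitative band (per August, 2016)."""
    if not 0 <= score <= 100:
        raise ValueError("CCTST scores range from 0 to 100")
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"
```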

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
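The inter-rater agreement statistic reported above (the fraction of dimension ratings on which two independent raters fell within 1 point of each other) reduces to a simple computation. A minimal sketch, with the function name our own:

```python
def within_one_agreement(ratings_a, ratings_b):
    """Fraction of paired ratings that differ by at most 1 point
    on the five-point BioTAP scale."""
    assert len(ratings_a) == len(ratings_b)
    close = sum(1 for a, b in zip(ratings_a, ratings_b) if abs(a - b) <= 1)
    return close / len(ratings_a)
```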

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
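The two cumulative BioTAP measures described above can be sketched as follows. The loadings shown here are hypothetical placeholders for illustration only; the actual factor loadings, and the exact factor-score computation, are those published by Dowd et al. (2016).

```python
# Hypothetical loadings for the nine BioTAP dimensions, for illustration
# only; actual values are published in Dowd et al. (2016).
HYPOTHETICAL_LOADINGS = [0.7, 0.8, 0.6, 0.7, 0.75, 0.5, 0.55, 0.6, 0.5]

def partial_sum(scores):
    """Partial sum over BioTAP questions 1-5 (higher-order reasoning)."""
    return sum(scores[:5])

def factor_score(scores, loadings=HYPOTHETICAL_LOADINGS):
    """Loading-weighted composite across all nine BioTAP dimensions."""
    return sum(l * s for l, s in zip(loadings, scores))
```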

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and the overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
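Pearson's correlation, the workhorse of the pairwise analyses above, reduces to a short computation. A self-contained sketch (in practice one would use a statistics package, which also supplies the associated p values):

```python
def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```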

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
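The Holm (1979) step-down procedure compares the k-th smallest of m p values against alpha / (m − k + 1) and stops at the first failure; with 35 comparisons, the smallest p value is held to 0.05/35 ≈ 0.0014. A minimal sketch of the procedure:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Return a list of booleans, True where the null hypothesis is
    rejected under the Holm (1979) step-down procedure."""
    m = len(p_values)
    # Visit p values in ascending order, remembering original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        # Compare the (rank+1)-th smallest p value against alpha / (m - rank).
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break  # once one test fails, all larger p values fail too
    return reject
```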

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.
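The t statistics and Cohen's d effect sizes above both rest on the pooled standard deviation of the mastery and nonmastery groups. A minimal sketch of the two computations (in practice a statistics package would also supply p values); the function names are ours:

```python
from statistics import mean, stdev

def pooled_sd(g1, g2):
    """Pooled standard deviation of two independent samples."""
    n1, n2 = len(g1), len(g2)
    return (((n1 - 1) * stdev(g1) ** 2 + (n2 - 1) * stdev(g2) ** 2)
            / (n1 + n2 - 2)) ** 0.5

def t_statistic(g1, g2):
    """Independent-samples t statistic (equal variances assumed)."""
    n1, n2 = len(g1), len(g2)
    return (mean(g1) - mean(g2)) / (pooled_sd(g1, g2) * (1 / n1 + 1 / n2) ** 0.5)

def cohens_d(g1, g2):
    """Cohen's d: the mean difference expressed in pooled-SD units."""
    return (mean(g1) - mean(g2)) / pooled_sd(g1, g2)
```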

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

** p < 0.01.

*** p < 0.001.

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing, captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing. Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and the results of other studies (with different populations) suggest that the observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So, while the implications of this work most directly apply to honors thesis writers, we tentatively suggest that all students could benefit from further study of these relationships.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP (Reynolds et al., 2009), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses (Reynolds and Thompson, 2011; Dowd et al., 2016), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills (Quitadamo and Kurtz, 2007; Quitadamo et al., 2008), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method (Miri et al., 2007) in terms of the epistemological appreciation of the functions and coordination of evidence (Kuhn and Dean, 2004; Zeineddin and Abd-El-Khalick, 2010; Ding et al., 2016) and disciplinary paradigms of truth and justification (Moshman, 2015).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step toward achieving more integrated learning outcomes. We hope this work spurs greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

ACKNOWLEDGMENTS

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August, D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer, C. H., Taylor, E., & Gillmore, G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell, A. N., & Lemons, P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, 56(1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner, N. H., & Frazier, C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, 8(1), 47–64.
  • Clase, K. L., Gundlach, E., & Pelaez, N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, 38(5), 290–295.
  • Condon, W., & Kelly-Riley, D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, 9(1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding, L., Wei, X., & Liu, X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, 46(5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd, J. E., Connolly, M. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, 46(1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd, J. E., Roy, C. P., Thompson, R. J., Jr., & Reynolds, J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, 92(1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd, J. E., Thompson, R. J., Jr., & Reynolds, J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 27, 36–51.
  • Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman, R. D., Russell, A. A., & Worden, K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, 36(5), 46–52.
  • Greenhoot, A. F., Semb, G., Colombo, J., & Schreiber, T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, 18(2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga, D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, 20(1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen, J. S., Bosack, T., Clay, S., McCarthy, M., Dunn, D. S., Hill, G. W., … Whitlock, K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, 30(3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand, B., & Keys, C. W. (1999). Inquiry investigation. Science Teacher, 66(4), 27–29.
  • Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6(2), 65–70.
  • Holyoak, K. J., & Morrison, R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly, G. J., & Takao, A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, 86(3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn, D., & Dean, D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, 5(2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23(4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson, A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, 94(2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish, D., LaVaque-Manty, D., Silver, N., & Kaplan, M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson, R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri, B., David, B.-C., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37(4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman, D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila, P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, 166(1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo, I. J., Faiola, C. L., Johnson, J. E., & Kurtz, M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, 7(3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo, I. J., & Kurtz, M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, 6(2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds, J. A., Smith, R., Moskovitz, C., & Sayle, A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, 59(10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds, J. A., Thaiss, C., Katkin, W., & Thompson, R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, 11(1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds, J. A., & Thompson, R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, 10(2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla, M., Brosseau-Liard, P. E., & Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, 17(3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson, N. S., & Sadler-McKnight, N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, 17(1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq, V. N., Stefani, L. A. J., Butcher, A. C., & Heylings, D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, 23(3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman, B. E. C., Strickland, D. C., Johnson, R. L., & Payne, J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, 36(5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping, K. J., Smith, E. F., Swanson, I., & Elliot, A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, 25(2), 149–169. https://doi.org/10.1080/713611428
  • Willison, J., & O’Regan, K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, 26(4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin, T., Carter, V. C., & Fletcher, L. (2010). Vision and Change in Biology Undergraduate Education: A call for action—Initial responses. CBE—Life Sciences Education, 9(2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin, A., & Abd-El-Khalick, F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, 47(9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman, C. (2000). The development of scientific reasoning skills. Developmental Review, 20(1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, 27(2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001
Jason E. Dowd, Robert J. Thompson, Leslie Schiff, Kelaine Haas, Christine Hohmann, Chris Roy, Warren Meck, and John Bruno. Rebecca Price, Monitoring Editor.

Submitted: 17 March 2017 Revised: 19 October 2017 Accepted: 20 October 2017

© 2018 J. E. Dowd et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  • Nuclear Issues (Environmental Science)
  • Pollution and Threats to the Environment (Environmental Science)
  • Social Impact of Environmental Issues (Environmental Science)
  • History of Science and Technology
  • Browse content in Materials Science
  • Ceramics and Glasses
  • Composite Materials
  • Metals, Alloying, and Corrosion
  • Nanotechnology
  • Browse content in Mathematics
  • Applied Mathematics
  • Biomathematics and Statistics
  • History of Mathematics
  • Mathematical Education
  • Mathematical Finance
  • Mathematical Analysis
  • Numerical and Computational Mathematics
  • Probability and Statistics
  • Pure Mathematics
  • Browse content in Neuroscience
  • Cognition and Behavioural Neuroscience
  • Development of the Nervous System
  • Disorders of the Nervous System
  • History of Neuroscience
  • Invertebrate Neurobiology
  • Molecular and Cellular Systems
  • Neuroendocrinology and Autonomic Nervous System
  • Neuroscientific Techniques
  • Sensory and Motor Systems
  • Browse content in Physics
  • Astronomy and Astrophysics
  • Atomic, Molecular, and Optical Physics
  • Biological and Medical Physics
  • Classical Mechanics
  • Computational Physics
  • Condensed Matter Physics
  • Electromagnetism, Optics, and Acoustics
  • History of Physics
  • Mathematical and Statistical Physics
  • Measurement Science
  • Nuclear Physics
  • Particles and Fields
  • Plasma Physics
  • Quantum Physics
  • Relativity and Gravitation
  • Semiconductor and Mesoscopic Physics
  • Browse content in Psychology
  • Affective Sciences
  • Clinical Psychology
  • Cognitive Neuroscience
  • Cognitive Psychology
  • Criminal and Forensic Psychology
  • Developmental Psychology
  • Educational Psychology
  • Evolutionary Psychology
  • Health Psychology
  • History and Systems in Psychology
  • Music Psychology
  • Neuropsychology
  • Organizational Psychology
  • Psychological Assessment and Testing
  • Psychology of Human-Technology Interaction
  • Psychology Professional Development and Training
  • Research Methods in Psychology
  • Social Psychology
  • Browse content in Social Sciences
  • Browse content in Anthropology
  • Anthropology of Religion
  • Human Evolution
  • Medical Anthropology
  • Physical Anthropology
  • Regional Anthropology
  • Social and Cultural Anthropology
  • Theory and Practice of Anthropology
  • Browse content in Business and Management
  • Business History
  • Business Ethics
  • Business Strategy
  • Business and Technology
  • Business and Government
  • Business and the Environment
  • Comparative Management
  • Corporate Governance
  • Corporate Social Responsibility
  • Entrepreneurship
  • Health Management
  • Human Resource Management
  • Industrial and Employment Relations
  • Industry Studies
  • Information and Communication Technologies
  • International Business
  • Knowledge Management
  • Management and Management Techniques
  • Operations Management
  • Organizational Theory and Behaviour
  • Pensions and Pension Management
  • Public and Nonprofit Management
  • Strategic Management
  • Supply Chain Management
  • Browse content in Criminology and Criminal Justice
  • Criminal Justice
  • Criminology
  • Forms of Crime
  • International and Comparative Criminology
  • Youth Violence and Juvenile Justice
  • Development Studies
  • Browse content in Economics
  • Agricultural, Environmental, and Natural Resource Economics
  • Asian Economics
  • Behavioural Finance
  • Behavioural Economics and Neuroeconomics
  • Econometrics and Mathematical Economics
  • Economic Methodology
  • Economic History
  • Economic Systems
  • Economic Development and Growth
  • Financial Markets
  • Financial Institutions and Services
  • General Economics and Teaching
  • Health, Education, and Welfare
  • History of Economic Thought
  • International Economics
  • Labour and Demographic Economics
  • Law and Economics
  • Macroeconomics and Monetary Economics
  • Microeconomics
  • Public Economics
  • Urban, Rural, and Regional Economics
  • Welfare Economics
  • Browse content in Education
  • Adult Education and Continuous Learning
  • Care and Counselling of Students
  • Early Childhood and Elementary Education
  • Educational Equipment and Technology
  • Educational Strategies and Policy
  • Higher and Further Education
  • Organization and Management of Education
  • Philosophy and Theory of Education
  • Schools Studies
  • Secondary Education
  • Teaching of a Specific Subject
  • Teaching of Specific Groups and Special Educational Needs
  • Teaching Skills and Techniques
  • Browse content in Environment
  • Applied Ecology (Social Science)
  • Climate Change
  • Conservation of the Environment (Social Science)
  • Environmentalist Thought and Ideology (Social Science)
  • Natural Disasters (Environment)
  • Social Impact of Environmental Issues (Social Science)
  • Browse content in Human Geography
  • Cultural Geography
  • Economic Geography
  • Political Geography
  • Browse content in Interdisciplinary Studies
  • Communication Studies
  • Museums, Libraries, and Information Sciences
  • Browse content in Politics
  • African Politics
  • Asian Politics
  • Chinese Politics
  • Comparative Politics
  • Conflict Politics
  • Elections and Electoral Studies
  • Environmental Politics
  • European Union
  • Foreign Policy
  • Gender and Politics
  • Human Rights and Politics
  • Indian Politics
  • International Relations
  • International Organization (Politics)
  • International Political Economy
  • Irish Politics
  • Latin American Politics
  • Middle Eastern Politics
  • Political Theory
  • Political Behaviour
  • Political Economy
  • Political Institutions
  • Political Methodology
  • Political Communication
  • Political Philosophy
  • Political Sociology
  • Politics and Law
  • Public Policy
  • Public Administration
  • Quantitative Political Methodology
  • Regional Political Studies
  • Russian Politics
  • Security Studies
  • State and Local Government
  • UK Politics
  • US Politics
  • Browse content in Regional and Area Studies
  • African Studies
  • Asian Studies
  • East Asian Studies
  • Japanese Studies
  • Latin American Studies
  • Middle Eastern Studies
  • Native American Studies
  • Scottish Studies
  • Browse content in Research and Information
  • Research Methods
  • Browse content in Social Work
  • Addictions and Substance Misuse
  • Adoption and Fostering
  • Care of the Elderly
  • Child and Adolescent Social Work
  • Couple and Family Social Work
  • Developmental and Physical Disabilities Social Work
  • Direct Practice and Clinical Social Work
  • Emergency Services
  • Human Behaviour and the Social Environment
  • International and Global Issues in Social Work
  • Mental and Behavioural Health
  • Social Justice and Human Rights
  • Social Policy and Advocacy
  • Social Work and Crime and Justice
  • Social Work Macro Practice
  • Social Work Practice Settings
  • Social Work Research and Evidence-based Practice
  • Welfare and Benefit Systems
  • Browse content in Sociology
  • Childhood Studies
  • Community Development
  • Comparative and Historical Sociology
  • Economic Sociology
  • Gender and Sexuality
  • Gerontology and Ageing
  • Health, Illness, and Medicine
  • Marriage and the Family
  • Migration Studies
  • Occupations, Professions, and Work
  • Organizations
  • Population and Demography
  • Race and Ethnicity
  • Social Theory
  • Social Movements and Social Change
  • Social Research and Statistics
  • Social Stratification, Inequality, and Mobility
  • Sociology of Religion
  • Sociology of Education
  • Sport and Leisure
  • Urban and Rural Studies
  • Browse content in Warfare and Defence
  • Defence Strategy, Planning, and Research
  • Land Forces and Warfare
  • Military Administration
  • Military Life and Institutions
  • Naval Forces and Warfare
  • Other Warfare and Defence Issues
  • Peace Studies and Conflict Resolution
  • Weapons and Equipment

The Oxford Handbook of Thinking and Reasoning


35 Scientific Thinking and Reasoning

Kevin N. Dunbar, Department of Human Development and Quantitative Methodology, University of Maryland, College Park, MD

David Klahr, Department of Psychology, Carnegie Mellon University, Pittsburgh, PA

  • Published: 21 November 2012

Scientific thinking refers to both thinking about the content of science and the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. Here we cover both the history of research on scientific thinking and the different approaches that have been used, highlighting common themes that have emerged over the past 50 years of research. Future research will focus on the collaborative aspects of scientific thinking, on effective methods for teaching science, and on the neural underpinnings of the scientific mind.

There is no unitary activity called “scientific discovery”; there are activities of designing experiments, gathering data, inventing and developing observational instruments, formulating and modifying theories, deducing consequences from theories, making predictions from theories, testing theories, inducing regularities and invariants from data, discovering theoretical constructs, and others. — Simon, Langley, & Bradshaw, 1981, p. 2

What Is Scientific Thinking and Reasoning?

There are two kinds of thinking we call “scientific.” The first, and most obvious, is thinking about the content of science. People are engaged in scientific thinking when they are reasoning about such entities and processes as force, mass, energy, equilibrium, magnetism, atoms, photosynthesis, radiation, geology, or astrophysics (and, of course, cognitive psychology!). The second kind of scientific thinking includes the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. However, these reasoning processes are not unique to scientific thinking: They are the very same processes involved in everyday thinking. As Einstein put it:

The scientific way of forming concepts differs from that which we use in our daily life, not basically, but merely in the more precise definition of concepts and conclusions; more painstaking and systematic choice of experimental material, and greater logical economy. (The Common Language of Science, 1941, reprinted in Einstein, 1950, p. 98)

Nearly 40 years after Einstein's remarkably insightful statement, Francis Crick offered a similar perspective: that great discoveries in science result not from extraordinary mental processes, but rather from common ones. The greatness of the discovery lies in the thing discovered.

I think what needs to be emphasized about the discovery of the double helix is that the path to it was, scientifically speaking, fairly commonplace. What was important was not the way it was discovered, but the object discovered—the structure of DNA itself. (Crick, 1988, p. 67; emphasis added)

Under this view, scientific thinking involves the same general-purpose cognitive processes—such as induction, deduction, analogy, problem solving, and causal reasoning—that humans apply in nonscientific domains. These processes are covered in several different chapters of this handbook: Rips, Smith, & Medin, Chapter 11 on induction; Evans, Chapter 8 on deduction; Holyoak, Chapter 13 on analogy; Bassok & Novick, Chapter 21 on problem solving; and Cheng & Buehner, Chapter 12 on causality. One might question the claim that the highly specialized procedures associated with doing science in the “real world” can be understood by investigating the thinking processes used in laboratory studies of the sort described in this volume. However, when the focus is on major scientific breakthroughs, rather than on the more routine, incremental progress in a field, the psychology of problem solving provides a rich source of ideas about how such discoveries might occur. As Simon and his colleagues put it:

It is understandable, if ironic, that ‘normal’ science fits … the description of expert problem solving, while ‘revolutionary’ science fits the description of problem solving by novices. It is understandable because scientific activity, particularly at the revolutionary end of the continuum, is concerned with the discovery of new truths, not with the application of truths that are already well-known … it is basically a journey into unmapped terrain. Consequently, it is mainly characterized, as is novice problem solving, by trial-and-error search. The search may be highly selective—but it reaches its goal only after many halts, turnings, and back-trackings. (Simon, Langley, & Bradshaw, 1981, p. 5)

The research literature on scientific thinking can be roughly categorized according to the two types of scientific thinking listed in the opening paragraph of this chapter: (1) One category focuses on thinking that directly involves scientific content. Such research ranges from studies of young children reasoning about the sun-moon-earth system (Vosniadou & Brewer, 1992) to college students reasoning about chemical equilibrium (Davenport, Yaron, Klahr, & Koedinger, 2008), to research that investigates collaborative problem solving by world-class researchers in real-world molecular biology labs (Dunbar, 1995). (2) The other category focuses on “general” cognitive processes, but it tends to do so by analyzing people's problem-solving behavior when they are presented with relatively complex situations that involve the integration and coordination of several different types of processes, and that are designed to capture some essential features of “real-world” science in the psychology laboratory (Bruner, Goodnow, & Austin, 1956; Klahr & Dunbar, 1988; Mynatt, Doherty, & Tweney, 1977).

A number of overlapping research traditions have been used to investigate scientific thinking. In what follows, we cover both the history of research on scientific thinking and the different approaches that have been used, highlighting common themes that have emerged over the past 50 years of research.

A Brief History of Research on Scientific Thinking

Science is often considered one of the hallmarks of the human species, along with art and literature. Illuminating the thought processes used in science thus reveals key aspects of the human mind. The thought processes underlying scientific thinking have fascinated both scientists and nonscientists because the products of science have transformed our world and because the process of discovery is shrouded in mystery. Scientists talk of the chance discovery, the flash of insight, the years of perspiration, and the voyage of discovery. These images of science have helped make the mental processes underlying the discovery process intriguing to cognitive scientists as they attempt to uncover what really goes on inside the scientific mind and how scientists really think. Furthermore, the possibilities that scientists can be taught to think better by avoiding mistakes that have been clearly identified in research on scientific thinking, and that their scientific process could be partially automated, make scientific thinking a topic of enduring interest.

The cognitive processes underlying scientific discovery and day-to-day scientific thinking have been a topic of intense scrutiny and speculation for almost 400 years (e.g., Bacon, 1620; Galilei, 1638; Klahr, 2000; Tweney, Doherty, & Mynatt, 1981). Understanding the nature of scientific thinking has been a central issue not only for our understanding of science but also for our understanding of what it is to be human. Bacon's Novum Organum in 1620 sketched out some of the key features of the ways that experiments are designed and data interpreted. Over the ensuing 400 years philosophers and scientists vigorously debated the appropriate methods that scientists should use (see Giere, 1993). These debates typically resulted in the espousal of a particular type of reasoning method, such as induction or deduction. It was not until the Gestalt psychologists began working on the nature of human problem solving, during the 1940s, that experimental psychologists began to investigate the cognitive processes underlying scientific thinking and reasoning.

The Gestalt psychologist Max Wertheimer pioneered the investigation of scientific thinking (of the first type described earlier: thinking about scientific content) in his landmark book Productive Thinking (Wertheimer, 1945). Wertheimer spent a considerable amount of time corresponding with Albert Einstein, attempting to discover how Einstein generated the concept of relativity. Wertheimer argued that Einstein had to overcome the structure of Newtonian physics at each step in his theorizing, and the ways that Einstein actually achieved this restructuring were articulated in terms of Gestalt theories. (For a recent and different account of how Einstein made his discovery, see Galison, 2003.) We will see later how this process of overcoming alternative theories is an obstacle that both scientists and nonscientists need to deal with when evaluating and theorizing about the world.

One of the first investigations of scientific thinking of the second type (i.e., collections of general-purpose processes operating on complex, abstract components of scientific thought) was carried out by Jerome Bruner and his colleagues at Harvard (Bruner et al., 1956). They argued that a key activity engaged in by scientists is to determine whether a particular instance is a member of a category. For example, a scientist might want to discover which substances undergo fission when bombarded by neutrons and which substances do not. Here, scientists have to discover the attributes that make a substance undergo fission. Bruner et al. saw scientific thinking as the testing of hypotheses and the collecting of data with the end goal of determining whether something is a member of a category. They invented a paradigm where people were required to formulate hypotheses and collect data to test their hypotheses. In one type of experiment, the participants were shown a card such as one with two borders and three green triangles. The participants were asked to determine the concept that this card represented by choosing other cards and getting feedback from the experimenter as to whether the chosen card was an example of the concept. In this case the participant may have thought that the concept was green and chosen a card with two green squares and one border. If the underlying concept was green, then the experimenter would say that the card was an example of the concept. In terms of scientific thinking, choosing a new card is akin to conducting an experiment, and the feedback from the experimenter is similar to knowing whether a hypothesis is confirmed or disconfirmed. Using this approach, Bruner et al. identified a number of strategies that people use to formulate and test hypotheses.
They found that a key factor determining which hypothesis-testing strategy people use is the amount of memory capacity that the strategy takes up (see also Morrison & Knowlton, Chapter 6; Medin et al., Chapter 11). They also discovered that it was much more difficult for people to discover negative concepts (e.g., not blue) than positive concepts (e.g., blue). Although Bruner et al.'s research is most commonly viewed as work on concepts, they saw their work as uncovering a key component of scientific thinking.
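The structure of the card paradigm can be sketched as a small feedback loop. The sketch below is illustrative rather than Bruner et al.'s actual procedure: the attribute values, the hidden "green" concept, and the probe cards are all hypothetical stand-ins.

```python
# Illustrative sketch (not Bruner et al.'s procedure verbatim): each card is a
# tuple of attribute values, the experimenter holds a hidden concept, and each
# "experiment" is choosing a card and receiving yes/no feedback.

def concept_is_green(card):
    """Hidden concept held by the experimenter: 'all green cards'."""
    color, shape, count, borders = card
    return color == "green"

def run_session(concept, chosen_cards):
    """Return the experimenter's yes/no feedback for each chosen card."""
    return [(card, concept(card)) for card in chosen_cards]

# A participant who suspects the concept is 'green' varies every attribute
# except color; the final non-green card probes the concept's boundary.
probes = [
    ("green", "square", 2, 1),
    ("green", "circle", 3, 2),
    ("red", "triangle", 3, 2),
]
feedback = run_session(concept_is_green, probes)
for card, is_member in feedback:
    print(card, "->", "yes" if is_member else "no")
```

Each chosen card plays the role of an experiment, and the yes/no feedback confirms or disconfirms the participant's current hypothesis about the concept.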

A second early line of research on scientific thinking was developed by Peter Wason and his colleagues (Wason, 1968). Like Bruner et al., Wason saw a key component of scientific thinking as being the testing of hypotheses. Whereas Bruner et al. focused on the different types of strategies that people use to formulate hypotheses, Wason focused on whether people adopt a strategy of trying to confirm or disconfirm their hypotheses. Drawing on Popper's (1959) view that scientists should try to falsify rather than confirm their hypotheses, Wason devised a deceptively simple task in which participants were given three numbers, such as 2-4-6, and were asked to discover the rule underlying the three numbers. Participants were asked to generate other triads of numbers, and the experimenter would tell them whether each triad was consistent or inconsistent with the rule. They were told to state the rule as soon as they were sure they knew it. Most participants began the experiment by thinking that the rule was even numbers increasing by 2. They then attempted to confirm their hypothesis by generating triads like 8-10-12 and 14-16-18. These triads are consistent with the rule, and the participants were told so. However, when they proposed the rule—even numbers increasing by 2—they were told that the rule was incorrect. The correct rule was numbers of increasing magnitude! From this research, Wason concluded that people try to confirm their hypotheses, whereas, normatively speaking, they should try to disconfirm them. One implication of this research is that confirmation bias is not restricted to scientists but is a general human tendency.
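The logic of the 2-4-6 task is easy to make concrete. In the sketch below, the true rule and the participant's hypothesis are the ones described above; the particular probe triads are illustrative. Confirmatory probes cannot separate the two rules, because both answer "yes"; only a probe the hypothesis rejects can reveal that it is too narrow.

```python
# A minimal simulation of the 2-4-6 task. The true rule is 'any sequence of
# increasing magnitude'; the participant's hypothesis is 'even numbers
# increasing by 2'.

def true_rule(triad):
    a, b, c = triad
    return a < b < c  # numbers of increasing magnitude

def hypothesis(triad):
    a, b, c = triad
    return a % 2 == 0 and b == a + 2 and c == b + 2

# Confirmatory probes: triads the participant expects to fit the hypothesis.
confirming_probes = [(8, 10, 12), (14, 16, 18)]
for triad in confirming_probes:
    # Both rules say 'yes', so this feedback cannot separate them.
    assert true_rule(triad) and hypothesis(triad)

# A disconfirming probe: fits the true rule but violates the hypothesis.
disconfirming_probe = (1, 3, 7)
print(true_rule(disconfirming_probe))   # the experimenter says 'yes' (True)
print(hypothesis(disconfirming_probe))  # but the hypothesis predicted 'no' (False)
```

Because the hypothesis is a strict subset of the true rule, every confirmatory probe returns "yes" under both rules, which is exactly why a confirm-only strategy fails in this task.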

It was not until the 1970s that a general account of scientific reasoning was proposed. Herbert Simon, often in collaboration with Allen Newell, proposed that scientific thinking is a form of problem solving, and that problem solving is a search in a problem space. Newell and Simon's theory of problem solving is discussed in many places in this handbook, usually in the context of specific problems (see especially Bassok & Novick, Chapter 21). Simon, however, devoted considerable time to understanding many different scientific discoveries and scientific reasoning processes. The common thread in his research was that scientific thinking and discovery are not mysterious, magical processes but forms of problem solving in which clear heuristics are used. Simon's goal was to articulate, at a fine-grained level, the heuristics that scientists use in their research. By constructing computer programs that simulated the process of several major scientific discoveries, Simon and colleagues were able to articulate the specific computations that scientists could have used in making those discoveries (Langley, Simon, Bradshaw, & Zytkow, 1987; see the section on “Computational Approaches to Scientific Thinking”). Particularly influential was Simon and Lea's (1974) work demonstrating that concept formation and induction consist of a search in two problem spaces: a space of instances and a space of rules. This idea has influenced problem-solving accounts of scientific thinking that will be discussed in the next section.

Overall, the work of Bruner, Wason, and Simon laid the foundations for contemporary research on scientific thinking. Early research on scientific thinking is summarized in Tweney, Doherty, and Mynatt's 1981 book On Scientific Thinking, where they sketched out many of the themes that have dominated research on scientific thinking over the past few decades. Other more recent books, such as Cognitive Models of Science (Giere, 1993), Exploring Science (Klahr, 2000), Cognitive Basis of Science (Carruthers, Stich, & Siegal, 2002), and New Directions in Scientific and Technical Thinking (Gorman, Kincannon, Gooding, & Tweney, 2004), provide detailed analyses of different aspects of scientific discovery. Another important collection is Vosniadou's handbook on conceptual change research (Vosniadou, 2008). In this chapter, we discuss the main approaches that have been used to investigate scientific thinking.

How does one go about investigating the many different aspects of scientific thinking? One common approach to the study of the scientific mind has been to investigate several key aspects of scientific thinking using abstract tasks designed to mimic some essential characteristics of “real-world” science. Numerous methodologies have been used to analyze the genesis of scientific concepts, theories, hypotheses, and experiments: researchers have run experiments, collected verbal protocols, written computer programs, and analyzed particular scientific discoveries. A more recent development has been to increase the ecological validity of such research by investigating scientists as they reason “live” (in vivo studies of scientific thinking) in their own laboratories (Dunbar, 1995, 2002). From a “Thinking and Reasoning” standpoint, the major aspects of scientific thinking that have been most actively investigated are problem solving, analogical reasoning, hypothesis testing, conceptual change, collaborative reasoning, inductive reasoning, and deductive reasoning.

Scientific Thinking as Problem Solving

One of the primary goals of accounts of scientific thinking has been to provide an overarching framework to understand the scientific mind. One framework that has had a great influence in cognitive science is that scientific thinking and scientific discovery can be conceived as a form of problem solving. As noted in the opening section of this chapter, Simon (1977; Simon, Langley, & Bradshaw, 1981) argued that both scientific thinking in general and problem solving in particular could be thought of as a search in a problem space. A problem space consists of all the possible states of a problem and all the operations that a problem solver can use to get from one state to the next. According to this view, by characterizing the types of representations and procedures that people use to get from one state to another, it is possible to understand scientific thinking. Thus, scientific thinking can be characterized as a search in various problem spaces (Simon, 1977). Simon investigated a number of scientific discoveries by bringing participants into the laboratory, providing them with the data that a scientist had access to, and getting them to reason about the data and rediscover a scientific concept. He then analyzed the verbal protocols that participants generated and mapped out the types of problem spaces that the participants searched in (e.g., Qin & Simon, 1990). Kulkarni and Simon (1988) used a more historical approach to uncover the problem-solving heuristics that Krebs used in his discovery of the urea cycle. They analyzed Krebs's diaries, proposed a set of problem-solving heuristics that he used in his research, and then built a computer program incorporating those heuristics and the biological knowledge that Krebs had before he made his discoveries. Of particular importance are the program's search heuristics, which include experimental proposal heuristics and data interpretation heuristics. A key example is an unusualness heuristic that focuses on unusual findings and guides search through a space of theories and a space of experiments.
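The notion of a problem space can be illustrated with a minimal sketch, assuming a toy problem that is purely hypothetical (reach a target number from 1 by adding 1 or doubling): states are numbers, operators map a state to its successors, and a search procedure explores the space.

```python
# A minimal sketch of 'search in a problem space': states, operators that map
# one state to neighboring states, and a search procedure over the space.
# The toy problem here is illustrative, not one of Simon's tasks.

from collections import deque

def operators(state):
    """Operators take the solver from one state to its successor states."""
    return [state + 1, state * 2]

def search(start, goal):
    """Breadth-first search through the problem space; returns a path of
    states from start to goal."""
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in operators(path[-1]):
            if nxt not in seen and nxt <= goal:  # prune states past the goal
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

print(search(1, 10))  # -> [1, 2, 4, 5, 10]
```

On this view, a theory of scientific thinking amounts to characterizing the representations (states), operators, and heuristics that prune or order the search, rather than the brute enumeration used in this sketch.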

Klahr and Dunbar (1988) extended the search in a problem space approach and proposed that scientific thinking can be thought of as a search through two related spaces: a hypothesis space and an experiment space. Each problem space that a scientist uses will have its own types of representations and operators used to change the representations. Search in the hypothesis space constrains search in the experiment space. Klahr and Dunbar found that some participants move from the hypothesis space to the experiment space, whereas others move from the experiment space to the hypothesis space. These different types of searches lead to the proposal of different types of hypotheses and experiments. More recent work has extended the dual-space approach to include alternative problem-solving spaces, including those for data, instrumentation, and domain-specific knowledge (Klahr & Simon, 1999; Schunn & Klahr, 1995, 1996).

Scientific Thinking as Hypothesis Testing

Many researchers have regarded testing specific hypotheses predicted by theories as one of the key attributes of scientific thinking. Hypothesis testing is the process of evaluating a proposition by collecting evidence regarding its truth. Experimental cognitive research on scientific thinking that specifically examines this issue has tended to fall into two broad classes of investigations. The first class is concerned with the types of reasoning that lead scientists astray and thereby impede discovery. A large amount of research has been conducted on the potentially faulty reasoning strategies that both experimental participants and scientists use, such as considering only one favored hypothesis at a time, and on how such strategies prevent discoveries from being made. The second class is concerned with uncovering the mental processes underlying the generation of new scientific hypotheses and concepts. This research has tended to focus on the use of analogy and imagery in science, as well as the use of specific types of problem-solving heuristics.

Turning first to investigations of what diminishes scientific creativity, philosophers, historians, and experimental psychologists have devoted a considerable amount of research to “confirmation bias.” This occurs when scientists consider only one hypothesis (typically the favored hypothesis) and ignore alternative or potentially relevant hypotheses. This important phenomenon can distort the design of experiments, the formulation of theories, and the interpretation of data. Beginning with the work of Wason (1968), and as discussed earlier, researchers have repeatedly shown that when participants are asked to design an experiment to test a hypothesis, they predominantly design experiments that they think will yield results consistent with the hypothesis. Using the 2-4-6 task mentioned earlier, Klayman and Ha (1987) showed that in situations where one's hypothesis is likely to be confirmed, seeking confirmation is a normatively incorrect strategy, whereas when the probability of confirming one's hypothesis is low, attempting to confirm it can be an appropriate strategy. Historical analyses by Tweney (1989) concerning the way that Faraday made his discoveries, together with experiments investigating people testing hypotheses, have revealed that people use a “confirm early, disconfirm late” strategy: When people initially generate or are given hypotheses, they try to gather evidence that is consistent with the hypothesis. Once enough evidence has been gathered, people attempt to find the boundaries of their hypothesis and often try to disconfirm it.
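Klayman and Ha's point about when confirmation can work can be sketched informally (this is an illustration, not their formal analysis): a positive test, i.e. probing a case one's hypothesis accepts, can disconfirm only if the hypothesis accepts some case the true rule rejects.

```python
# Illustrative sketch: whether a positive test strategy can ever disconfirm
# depends on how the hypothesis overlaps the true rule. The rules and the
# domain below are hypothetical toy examples.

def positive_test_can_disconfirm(hypothesis, truth, domain):
    """A positive test disconfirms only if some case the hypothesis
    accepts is rejected by the true rule."""
    return any(hypothesis(x) and not truth(x) for x in domain)

domain = range(1, 101)
truth = lambda x: x % 2 == 0            # true rule: even numbers
narrow = lambda x: x % 4 == 0           # hypothesis strictly inside the rule
broad = lambda x: x % 2 == 0 or x == 7  # hypothesis that overreaches

print(positive_test_can_disconfirm(narrow, truth, domain))  # False
print(positive_test_can_disconfirm(broad, truth, domain))   # True
```

When the hypothesis is a strict subset of the true rule (as in Wason's 2-4-6 task), positive tests always come back "yes" and can never disconfirm; when the hypothesis overreaches the true rule, a positive test can expose the error.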

In an interesting variant on the confirmation bias paradigm, Gorman (1989) showed that when participants are told that there is a possibility of error in the data they receive, they assume that any data inconsistent with their favored hypothesis are due to error. Thus, the possibility of error "insulates" hypotheses against disconfirmation. This hypothesis has not been confirmed by other researchers (Penner & Klahr, 1996), but it is intriguing enough to warrant further investigation.

Confirmation bias is very difficult to overcome. Even when participants are asked to consider alternate hypotheses, they often fail to conduct experiments that could potentially disconfirm their favored one. Tweney and his colleagues provide an excellent overview of this phenomenon in their classic monograph On Scientific Thinking (1981). The precise reasons for this type of block are still widely debated. Researchers such as Michael Doherty have argued that working memory limitations make it difficult for people to consider more than one hypothesis. Consistent with this view, Dunbar and Sussman (1995) have shown that when participants are asked to hold irrelevant items in working memory while testing hypotheses, they are unable to switch hypotheses in the face of inconsistent evidence. While working memory limitations are involved in confirmation bias, they cannot be the whole story, because entire groups of scientists can also display it. The controversy over cold fusion is one example: large groups of scientists maintained their favored hypothesis even though more standard alternative hypotheses were available to explain their data. Mitroff (1974) provides some interesting examples of NASA scientists demonstrating confirmation bias, which highlight the roles of commitment and motivation in this process. See also MacPherson and Stanovich (2007) for specific strategies that can be used to overcome confirmation bias.

Causal Thinking in Science

Much of scientific thinking and scientific theory building pertains to the development of causal models between variables of interest. For example, do vaccines cause illnesses? Do carbon dioxide emissions cause global warming? Does water on a planet indicate that there is life on the planet? Scientists and nonscientists alike are constantly bombarded with statements regarding the causal relationship between such variables. How does one evaluate the status of such claims? What kinds of data are informative? How do scientists and nonscientists deal with data that are inconsistent with their theory?

A central issue in the causal reasoning literature, one that is directly relevant to scientific thinking, is the extent to which scientists and nonscientists alike are governed by the search for causal mechanisms (i.e., how a variable works) versus the search for statistical data (i.e., how often variables co-occur). This dichotomy can be boiled down to the search for qualitative versus quantitative information about the paradigm the scientist is investigating. Researchers from a number of cognitive psychology laboratories have found that people prefer to gather more information about an underlying mechanism than about the covariation between a cause and an effect (e.g., Ahn, Kalish, Medin, & Gelman, 1995). That is, the predominant strategy that students in simulations of scientific thinking use is to gather as much information as possible about how the objects under investigation work, rather than collecting large amounts of quantitative data to determine whether the observations hold across multiple samples. These findings suggest that a central component of scientific thinking may be to formulate explicit mechanistic causal models of scientific events.
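
The covariation side of this contrast can be illustrated with the delta-P index, a standard measure of how strongly a cause and an effect co-occur. The counts below are hypothetical and are not drawn from the studies cited above:

```python
# Delta-P: a standard covariation index computed from a 2x2 contingency
# table. It compares the probability of the effect given the cause with
# the probability of the effect in the cause's absence.

def delta_p(a, b, c, d):
    """a: cause present, effect present;   b: cause present, effect absent;
       c: cause absent,  effect present;   d: cause absent,  effect absent."""
    return a / (a + b) - c / (c + d)

# Hypothetical drug trial: 30 of 40 treated patients recover,
# versus 10 of 40 untreated controls.
print(delta_p(30, 10, 10, 30))  # 0.5
```

A delta-P near zero would indicate no covariation; the point in the text is that people often neglect such frequency data in favor of asking how the mechanism works.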

One type of situation in which causal reasoning has been observed extensively is when scientists obtain unexpected findings. Both historical and naturalistic research has revealed that reasoning causally about unexpected findings plays a central role in science. Indeed, scientists themselves frequently state that a finding was due to chance or was unexpected. Given that claims of unexpected findings are such a frequent component of scientists' autobiographies and interviews in the media, Dunbar (1995, 1997, 1999; Dunbar & Fugelsang, 2005; Fugelsang, Stein, Green, & Dunbar, 2004) decided to investigate the ways that scientists deal with unexpected findings. In 1991–1992, Dunbar spent a year in three molecular biology laboratories and one immunology laboratory at a prestigious U.S. university. He used the weekly laboratory meeting as a source of data on scientific discovery and scientific reasoning. (He termed this type of study "in vivo" cognition.) When he looked at the types of findings that the scientists made, he found that over 50% were unexpected and that these scientists had evolved a number of effective strategies for dealing with such findings. One clear strategy was to reason causally about the findings: Scientists attempted to build causal models of their unexpected findings. This causal model building results in the extensive use of collaborative reasoning, analogical reasoning, and problem-solving heuristics (Dunbar, 1997, 2001).

Many of the key unexpected findings that scientists reasoned about in the in vivo studies of scientific thinking were inconsistent with the scientists' preexisting causal models. A laboratory equivalent of the biology labs involved creating a situation in which students obtained unexpected findings that were inconsistent with their preexisting theories. Dunbar and Fugelsang (2005) examined this issue by creating a scientific causal thinking simulation where experimental outcomes were either expected or unexpected. Dunbar (1995) has called the study of people reasoning in a cognitive laboratory "in vitro" cognition. These investigators found that students spent considerably more time reasoning about unexpected findings than about expected findings. In addition, when assessing the overall degree to which their hypothesis was supported or refuted, participants spent the majority of their time considering unexpected findings. An analysis of participants' verbal protocols indicates that much of this extra time was spent formulating causal models for the unexpected findings. Similarly, scientists spend more time considering unexpected than expected findings, and this time is devoted to building causal models (Dunbar & Fugelsang, 2004).

Scientists know that unexpected findings occur often, and they have developed many strategies to take advantage of their unexpected findings. One of the most important places that they anticipate the unexpected is in designing experiments (Baker & Dunbar, 2000). They build different causal models of their experiments incorporating many conditions and controls. These multiple conditions and controls allow unknown mechanisms to manifest themselves. Thus, rather than being the victims of the unexpected, they create opportunities for unexpected events to occur, and once these events do occur, they have causal models that allow them to determine exactly where in the causal chain their unexpected finding arose. The results of these in vivo and in vitro studies all point to a more complex and nuanced account of how scientists and nonscientists alike test and evaluate hypotheses about theories.

The Roles of Inductive, Abductive, and Deductive Thinking in Science

One of the most basic characteristics of science is that scientists assume that the universe we live in follows predictable rules. Scientists reason using a variety of different strategies to make new scientific discoveries. Three frequently used reasoning strategies are inductive, abductive, and deductive reasoning. In the case of inductive reasoning, a scientist may observe a series of events and try to discover a rule that governs them. Once a rule is discovered, scientists can extrapolate from the rule to formulate theories of observed and yet-to-be-observed phenomena. One example is the discovery, using inductive reasoning, that a certain type of bacterium is a cause of many ulcers (Thagard, 1999). In a fascinating series of articles, Thagard documented the reasoning processes that Marshall and Warren went through in proposing this novel hypothesis. One key reasoning process was the use of induction by generalization. Marshall and Warren noted that almost all patients with gastritis had a spiral bacterium in their stomachs, and they formed the generalization that this bacterium is the cause of stomach ulcers. There are numerous other examples of induction by generalization in science, such as Tycho Brahe's induction about the motion of planets from his observations, Dalton's use of induction in chemistry, and the discovery of prions as the source of mad cow disease. Many theories of induction have used scientific discovery and reasoning as examples of this important reasoning process.

Another common type of inductive reasoning is to map a feature of one member of a category onto another member of the category. This is called categorical induction. This type of induction is a way of projecting a known property of one item onto another item from the same category. Thus, knowing that the Rous sarcoma virus is a retrovirus that uses RNA rather than DNA, a biologist might assume that another virus thought to be a retrovirus also uses RNA rather than DNA. Although this type of induction has typically not been discussed in accounts of scientific thinking, it is common in science. For an influential contribution to this literature, see Smith, Shafir, and Osherson (1993); for reviews, see Heit (2000) and Medin et al. (Chapter 11).

While less commonly mentioned than inductive reasoning, abductive reasoning is an important form of reasoning that scientists use when seeking to propose explanations for events such as unexpected findings (see Lombrozo, Chapter 14; Magnani et al., 2010). In Figure 35.1, taken from King (2011), the differences between inductive, abductive, and deductive thinking are highlighted. In the case of abduction, the reasoner attempts to generate explanations of the form "if situation X had occurred, could it have produced the current evidence I am attempting to interpret?" (For an interesting analysis of abductive reasoning, see the brief paper by Klahr & Masnick, 2001.) Of course, as in classical induction, such reasoning may produce a plausible account that is still not the correct one. However, abduction does involve the generation of new knowledge and is thus also related to research on creativity.

Figure 35.1 The different processes underlying inductive, abductive, and deductive reasoning in science. (Reproduced from King, 2011.)

Turning now to deductive thinking, many thinking processes that scientists adhere to follow traditional rules of deductive logic. These processes correspond to conditions in which a conclusion is deducible from a hypothesis. Though they are not always phrased in syllogistic form, deductive arguments can be expressed as syllogisms: brief statements in which the premises lead necessarily to the conclusion. Deductive reasoning is an extremely important aspect of scientific thinking because it underlies a large component of how scientists conduct their research. By looking at many scientific discoveries, we can often see deductive reasoning at work. Deductive arguments contain rules that state an assumption about how the world works, as well as a conclusion that necessarily follows from those rules. Numerous discoveries in physics, such as Vera Rubin's discovery of dark matter, are based on deductions. In the dark matter case, Rubin measured galactic rotation curves and, from the differences between the predicted and observed angular motions of galaxies, deduced that the distribution of mass in galaxies did not match that of their visible matter. This led her to propose that dark matter existed. In contemporary physics, the CERN Large Hadron Collider is being used to search for the Higgs boson, itself a deductive prediction from contemporary physics. If the Higgs boson is not found, it may lead to a radical revision of physics and a new understanding of mass (Hecht, 2011).
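
The necessary character of a deductive step can be sketched in code. This is a toy rendering of modus ponens; the rule and premise are invented for the example and echo the retrovirus case mentioned earlier:

```python
# A toy deductive step: modus ponens as rule application.
# Rule (assumed premise): if something is a retrovirus, then it uses RNA.

def modus_ponens(rule, premise):
    antecedent, consequent = rule
    # The conclusion follows necessarily only when the premise matches
    # the rule's antecedent; otherwise nothing can be deduced.
    return consequent if premise == antecedent else None

rule = ("is a retrovirus", "uses RNA")
print(modus_ponens(rule, "is a retrovirus"))  # uses RNA
print(modus_ponens(rule, "is a bacterium"))   # None
```

Unlike induction or abduction, nothing probabilistic is involved: given the rule and a matching premise, the conclusion cannot fail to hold.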

The Roles of Analogy in Scientific Thinking

One of the most widely mentioned reasoning processes used in science is analogy. Scientists use analogies to form a bridge between what they already know and what they are trying to explain, understand, or discover. In fact, many scientists have claimed that the making of certain analogies was instrumental in their making a scientific discovery, and almost all scientific autobiographies and biographies feature one particular analogy that is discussed in depth. Coupled with the fact that there has been an enormous research program on analogical thinking and reasoning (see Holyoak, Chapter 13), we now have a number of models and theories of analogical reasoning that suggest how analogy can play a role in scientific discovery (see Gentner, Holyoak, & Kokinov, 2001). By analyzing several major discoveries in the history of science, Thagard and Croft (1999), Nersessian (1999, 2008), and Gentner and Jeziorski (1993) have all shown that analogical reasoning is a key aspect of scientific discovery.

Traditional accounts of analogy distinguish between two components of analogical reasoning: the target and the source (Holyoak, Chapter 13; Gentner, 2010). The target is the concept or problem that a scientist is attempting to explain or solve. The source is another piece of knowledge that the scientist uses to understand the target or to explain the target to others. What the scientist does when making an analogy is map features of the source onto features of the target. By mapping the features of the source onto the target, new features of the target may be discovered, or the features of the target may be rearranged so that a new concept is invented and a scientific discovery is made. For example, a common analogy used with computers is to describe a harmful piece of software as a computer virus. Once a piece of software is called a virus, people can map features of biological viruses onto it, such as being small, spreading easily, self-replicating using a host, and causing damage. People map not only individual features of the source onto the target but also systems of relations. For example, if a computer virus is similar to a biological virus, then an immune system can be created on computers to protect them from future variants of a virus. One reason that scientific analogy is so powerful is that it can generate new knowledge, such as the creation of a computational immune system having many of the features of a real biological immune system. This analogy also leads to predictions that there will be newer computer viruses that are the computational equivalent of retroviruses, lacking DNA or standard instructions, and that will elude the computational immune system.
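
The source-to-target mapping described above can be sketched as a toy program. The feature list and the "defended against by" relation are illustrative assumptions, not part of any formal structure-mapping model:

```python
# A toy sketch of analogical mapping: features and one relation of the
# source (a biological virus) are projected onto the target (a computer
# virus), and the mapped relation licenses a new inference.

source = {
    "small": True,
    "spreads easily": True,
    "self-replicates using a host": True,
    "causes damage": True,
    "defended against by": "immune system",  # a relational feature
}

# Map every source feature onto the target.
target = {feature: value for feature, value in source.items()}

# The mapped relation suggests new knowledge about the target:
# computers, too, could have an "immune system."
print(target["defended against by"])  # immune system
```

Full structure-mapping theories operate over relational graphs rather than flat feature lists, but the projection step works in the same spirit: what is known about the source is carried over and then tested against the target.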

The process of making an analogy involves a number of key steps: retrieval of a source from memory, aligning the features of the source with those of the target, mapping features of the source onto those of the target, and possibly making new inferences about the target. Scientific discoveries are made when the source highlights a hitherto unknown feature of the target or restructures the target into a new set of relations. Interestingly, research on analogy has shown that participants do not easily use remote analogies (see Gentner et al., 1997; Holyoak & Thagard, 1995). Participants in experiments tend to focus on the sharing of a superficial feature between the source and the target, rather than the relations among features. In his in vivo studies of science, Dunbar (1995, 2001, 2002) investigated the ways that scientists use analogies while they are conducting their research and found that scientists use both relational and superficial features when they make analogies. Whether they use superficial or relational features depends on their goals. If their goal is to fix a problem in an experiment, their analogies are based upon superficial features. However, if their goal is to formulate hypotheses, they focus on analogies based upon sets of relations. One important difference between scientists and participants in experiments is that the scientists have deep relational knowledge of the processes that they are investigating and can hence use this relational knowledge to make analogies (see Holyoak, Chapter 13, for a thorough review of analogical reasoning).

Are scientific analogies always useful? Sometimes analogies can lead scientists and students astray. For example, Evelyn Fox Keller (1985) shows how an analogy between the pulsing of a lighthouse and the activity of the slime mold Dictyostelium led researchers astray for a number of years. Likewise, the analogy between the solar system (the source) and the structure of the atom (the target) has been shown to be potentially misleading to students taking more advanced courses in physics or chemistry. The solar system analogy has a number of misalignments with the structure of the atom, such as electrons repelling rather than attracting each other; moreover, electrons do not have individual orbits like planets but instead occupy clouds of electron density. Furthermore, students have serious misconceptions about the nature of the solar system itself, which can compound their misunderstanding of the nature of the atom (Fischler & Lichtfeld, 1992). While analogy is a powerful tool in science, like all forms of induction it can lead to incorrect conclusions.

Conceptual Change in Science

Scientific knowledge continually accumulates as scientists gather evidence about the natural world. Over extended time, this knowledge accumulation leads to major revisions, extensions, and new organizational forms for expressing what is known about nature. Indeed, these changes are so substantial that philosophers of science speak of "revolutions" in a variety of scientific domains (Kuhn, 1962). The psychological literature that explores the idea of revolutionary conceptual change can be roughly divided into (a) investigations of how scientists actually make discoveries and integrate those discoveries into existing scientific contexts, and (b) investigations of nonscientists ranging from infants, to children, to students in science classes. In this section we summarize the adult studies of conceptual change, and in the next section we look at its developmental aspects.

Scientific concepts, like all concepts, can be characterized as containing a variety of "knowledge elements": representations of words, thoughts, actions, objects, and processes. At certain points in the history of science, the accumulated evidence has demanded major shifts in the way these collections of knowledge elements are organized. This "radical conceptual change" process (see Keil, 1999; Nersessian, 1998, 2002; Thagard, 1992; Vosniadou, 1998, for reviews) requires the formation of a new conceptual system that organizes knowledge in new ways, adds new knowledge, and results in a very different conceptual structure. For more recent research on conceptual change, The International Handbook of Research on Conceptual Change (Vosniadou, 2008) provides a detailed compendium of theories and controversies within the field.

While conceptual change in science is usually characterized by large-scale changes in concepts that occur over extensive periods of time, it has been possible to observe conceptual change using in vivo methodologies. Dunbar (1995) reported a major conceptual shift in a group of immunologists who obtained a series of unexpected findings that forced them to propose a new concept in immunology, which in turn forced changes in other concepts. The drive behind this conceptual change was the discovery of a series of unexpected findings, or anomalies, that required the scientists to both revise and reorganize their conceptual knowledge. Interestingly, this conceptual change was achieved by a group of scientists reasoning collaboratively, rather than by a scientist working alone. Different scientists tend to work on different aspects of concepts, and also on different concepts, which when put together lead to a rapid change in entire conceptual structures.

Overall, accounts of conceptual change in individuals indicate that it is indeed similar to conceptual change in entire scientific fields. Individuals need to be confronted with anomalies that their preexisting theories cannot explain before entire conceptual structures are overthrown. However, replacement conceptual structures have to be generated before the old conceptual structure can be discarded. Sometimes, people do not overthrow their original conceptual theories and maintain their original views of many fundamental scientific concepts throughout their lives. Whether people actively possess naive theories, or whether they appear to have a naive theory because of the demand characteristics of the testing context, is a lively source of debate within the science education community (see Gupta, Hammer, & Redish, 2010).

Scientific Thinking in Children

Well before their first birthday, children appear to know several fundamental facts about the physical world. For example, studies with infants show that they behave as if they understand that solid objects endure over time (e.g., they don't just disappear and reappear), cannot move through each other, and move as a result of collisions with other solid objects or the force of gravity (Baillargeon, 2004; Carey, 1985; Cohen & Cashon, 2006; Duschl, Schweingruber, & Shouse, 2007; Gelman & Baillargeon, 1983; Gelman & Kalish, 2006; Mandler, 2004; Metz, 1995; Munakata, Casey, & Diamond, 2004). Even 6-month-olds are able to predict the future location of a moving object that they are attempting to grasp (Von Hofsten, 1980; Von Hofsten, Feng, & Spelke, 2000). In addition, infants appear to be able to make nontrivial inferences about causes and their effects (Gopnik et al., 2004).

The similarities between children's thinking and scientists' thinking have an inherent allure and an internal contradiction. The allure resides in the enthusiastic wonder and openness with which both children and scientists approach the world around them. The contradiction comes from the fact that different investigators of children's thinking have reached diametrically opposed conclusions about just how "scientific" children's thinking really is. Some claim support for the "child as a scientist" position (Brewer & Samarapungavan, 1991; Gelman & Wellman, 1991; Gopnik, Meltzoff, & Kuhl, 1999; Karmiloff-Smith, 1988; Sodian, Zaitchik, & Carey, 1991; Samarapungavan, 1992), while others offer serious challenges to that view (Fay & Klahr, 1996; Kern, Mirels, & Hinshaw, 1983; Kuhn, Amsel, & O'Laughlin, 1988; Schauble & Glaser, 1990; Siegler & Liebert, 1975). Such fundamentally incommensurate conclusions suggest that this very field—children's scientific thinking—is ripe for a conceptual revolution!

A recent comprehensive review (Duschl, Schweingruber, & Shouse, 2007) of what children bring to their science classes offers the following concise summary of the extensive developmental and educational research literature on children's scientific thinking:

Children entering school already have substantial knowledge of the natural world, much of which is implicit.

What children are capable of at a particular age is the result of a complex interplay among maturation, experience, and instruction. What is developmentally appropriate is not a simple function of age or grade, but rather is largely contingent on children's prior opportunities to learn.

Students' knowledge and experience play a critical role in their science learning, influencing four aspects of science understanding: (a) knowing, using, and interpreting scientific explanations of the natural world; (b) generating and evaluating scientific evidence and explanations; (c) understanding how scientific knowledge is developed in the scientific community; and (d) participating in scientific practices and discourse.

Students learn science by actively engaging in the practices of science.

In the previous section of this chapter we discussed conceptual change with respect to scientific fields and undergraduate science students. However, the idea that children undergo radical conceptual change, in which old "theories" need to be overthrown and reorganized, has been a central topic in understanding changes in scientific thinking both in children and across the life span. This radical conceptual change is thought to be necessary for acquiring many new concepts in physics and is regarded as the major source of difficulty for students. The factors at the root of this conceptual shift view have been difficult to determine, although a number of studies in cognitive development (Carey, 1985; Chi, 1992; Chi & Roscoe, 2002), in the history of science (Thagard, 1992), and in physics education (Clement, 1982; Mestre, 1991) give detailed accounts of the changes in knowledge representation that occur as people switch from one way of representing scientific knowledge to another.

One area where students show great difficulty in understanding scientific concepts is physics. Analyses of students' changing conceptions, using interviews, verbal protocols, and behavioral outcome measures, indicate that large-scale changes in students' concepts occur in physics education (see McDermott & Redish, 1999, for a review of this literature). Following Kuhn (1962), many researchers, though not all, have noted that students' changing conceptions resemble the sequences of conceptual changes that have occurred in the history of physics. These notions of radical paradigm shifts and ensuing incompatibility with past knowledge states have called attention to interesting parallels between the development of particular scientific concepts in children and in the history of physics. Investigations of nonphysicists' understanding of motion indicate that students have extensive misunderstandings of it. Some researchers have interpreted these findings as an indication that many people hold erroneous beliefs about motion similar to a medieval "impetus" theory (McCloskey, Caramazza, & Green, 1980). Furthermore, students appear to maintain impetus notions even after one or two courses in physics. In fact, some authors have noted that students who have taken one or two physics courses can perform worse on physics problems than naive students (Mestre, 1991). Thus, it is only after extensive learning that we see a conceptual shift from impetus theories of motion to Newtonian scientific theories.

How one's conceptual representation shifts from "naive" to Newtonian is a matter of contention: some have argued that the shift involves a radical conceptual change, whereas others have argued that the conceptual change is never really complete. For example, Kozhevnikov and Hegarty (2001) argue that many of the naive impetus notions of motion are maintained at the expense of Newtonian principles even after extensive training in physics. However, they argue that such impetus principles are maintained at an implicit level. Thus, although students can give the correct Newtonian answer to problems, their reaction times indicate that they are also using impetus theories when they respond. An alternative view of conceptual change focuses on whether there are real conceptual changes at all. Gupta, Hammer, and Redish (2010) and diSessa (2004) have conducted detailed investigations of changes in physics students' accounts of phenomena covered in elementary physics courses. They have found that, rather than possessing a naive theory that is replaced by the standard theory, many introductory physics students have no stable physical theory but instead construct their explanations from elementary pieces of knowledge of the physical world.

Computational Approaches to Scientific Thinking

Computational approaches have provided a more complete account of the scientific mind. Computational models provide specific, detailed accounts of the cognitive processes underlying scientific thinking. Early computational work consisted of taking a scientific discovery and building computational models of the reasoning processes involved in that discovery. Langley, Simon, Bradshaw, and Zytkow (1987) built a series of programs that simulated discoveries such as those of Copernicus, Bacon, and Stahl. These programs had various inductive reasoning algorithms built into them, and when given the data that the scientists had used, they were able to propose the same rules. Computational models make it possible to propose detailed models of the cognitive subcomponents of scientific thinking that specify exactly how scientific theories are generated, tested, and amended (see Darden, 1997, and Shrager & Langley, 1990, for accounts of this branch of research). More recently, the incorporation of scientific knowledge into computer programs has resulted in a shift in emphasis from using programs to simulate discoveries to building programs that help scientists make discoveries. A number of these programs have made novel discoveries. For example, Valdes-Perez (1994) has built systems for discoveries in chemistry, and Fajtlowicz has done this in mathematics (Erdos, Fajtlowicz, & Staton, 1991).
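
The flavor of these inductive discovery programs can be conveyed with a toy BACON-style search. This is a sketch under stated assumptions, not Langley et al.'s actual code: given numeric observations of two variables, it searches products of integer powers for a combination that stays nearly constant, here recovering the form of Kepler's third law from approximate planetary data:

```python
# A toy BACON-style law inducer: find integer exponents (i, j) such that
# d**i * p**j is (nearly) invariant across all observations.
from itertools import product

# Approximate orbital data as (distance in AU, period in years) for
# Venus, Earth, Mars, and Jupiter. Kepler's third law predicts that
# d**3 / p**2 is constant.
data = [(0.72, 0.61), (1.0, 1.0), (1.52, 1.88), (5.2, 11.86)]

def find_invariant(data, max_power=3):
    """Search exponent pairs and return (relative_spread, i, j) for the
    combination whose values vary least across the observations."""
    best = None
    for i, j in product(range(-max_power, max_power + 1), repeat=2):
        if i == 0 and j == 0:
            continue  # the trivial constant tells us nothing
        vals = [d**i * p**j for d, p in data]
        mean = sum(vals) / len(vals)
        spread = (max(vals) - min(vals)) / abs(mean)
        if best is None or spread < best[0]:
            best = (spread, i, j)
    return best

spread, i, j = find_invariant(data)
print(f"d**{i} * p**{j} is nearly constant (relative spread {spread:.3f})")
```

The search recovers exponents in the ratio 3 : -2 (or the equivalent -3 : 2), the form of Kepler's third law; the real BACON programs used richer heuristics over intermediate terms, but the spirit of heuristic search over candidate invariants is the same.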

These advances in computational discovery have led to new fields, conferences, journals, and even departments that specialize in the development of programs devised to search large databases in the hope of making new scientific discoveries (Langley, 2000, 2002). This process is commonly known as "data mining," and it has become viable only relatively recently, due to advances in computer technology. Biswal et al. (2010), Mitchell (2009), and Yang (2009) provide recent reviews of data mining in different scientific fields. Data mining is at the core of drug discovery, our understanding of the human genome, and our understanding of the universe for a number of reasons. First, vast databases concerning drug actions, biological processes, the genome, the proteome, and the universe itself now exist. Second, the development of high-throughput data-mining algorithms makes it possible to search for new drug targets, novel biological mechanisms, and new astronomical phenomena in relatively short periods of time. Research programs that once took decades, such as the development of penicillin, can now be carried out in days (Yang, 2009).

Another recent shift in the use of computers in scientific discovery has been to have computers and people make discoveries together, rather than expecting computers to make entire discoveries on their own. Instead of mimicking the whole discovery process as conducted by humans, computers can apply powerful algorithms that search for patterns in large databases and present those patterns to humans, who then use the output to make discoveries ranging from the structure of the human genome to the structure of the universe. However, some robots, such as ADAM, developed by King (2011), can actually perform the entire scientific process, from the generation of hypotheses to the conduct of experiments and the interpretation of results, with little human intervention. The ongoing development of scientific robots (King et al., 2009) thus continues the tradition started by Herbert Simon in the 1960s. However, controversies over whether such a robot is a "real scientist" continue to the present (Evans & Rzhetsky, 2010; Gianfelici, 2010; Haufe, Elliott, Burian, & O'Malley, 2010; O'Malley, 2011).

Scientific Thinking and Science Education

Accounts of the nature of science and research on scientific thinking have had profound effects on science education at many levels, particularly in recent years. Science education from the 1900s until the 1970s was primarily concerned with teaching students the content of science (such as Newton's laws of motion) and the methods that scientists use in their research (such as using experimental and control groups). Beginning in the 1980s, a number of reports (e.g., American Association for the Advancement of Science, 1993; National Commission on Excellence in Education, 1983; Rutherford & Ahlgren, 1991 ) stressed the need for teaching scientific thinking skills rather than just methods and content. The addition of scientific thinking skills to the science curriculum from kindergarten through adulthood was a major shift in focus. Many of the particular scientific thinking skills that have been emphasized are those covered in previous sections of this chapter, such as deductive and inductive thinking strategies. However, rather than focusing on one particular skill, such as induction, researchers in education have focused on how the different components of scientific thinking are put together in science. Furthermore, science educators have focused on situations where science is conducted collaboratively, rather than being the product of one person thinking alone. These changes in science education parallel changes in the methodologies used to investigate science, such as analyzing the ways that scientists think and reason in their laboratories.

By looking at science as a complex multilayered and group activity, many researchers in science education have adopted a constructivist approach. This approach sees learning as an active rather than a passive process, and it suggests that students learn through constructing their scientific knowledge. We will first describe a few examples of the constructivist approach to science education. Following that, we will address several lines of work that challenge some of the assumptions of the constructivist approach to science education.

Often the goal of constructivist science education is to produce conceptual change through guided instruction, with the teacher or professor acting as a guide to discovery rather than the keeper of all the facts. One recent and influential example is inquiry-based learning, which poses a problem or a puzzling event to students and asks them to propose a hypothesis that could explain it. Next, students are asked to collect data that test the hypothesis, draw conclusions, and then reflect on both the original problem and the thought processes they used to solve it. Often students use computers that aid in their construction of new knowledge and allow them to practice many of the different components of scientific thinking. For example, Reiser and his colleagues have developed a learning environment for biology in which students are encouraged to develop hypotheses in groups, codify the hypotheses, and search databases to test them (Reiser et al., 2001 ).

One of the myths of science is the lone scientist suddenly shouting "Eureka, I have made a discovery!" Instead, in vivo studies of scientists (e.g., Dunbar, 1995 , 2002 ), historical analyses of scientific discoveries (Nersessian, 1999 ), and studies of children learning science at museums have all pointed to collaborative discovery mechanisms as one of the driving forces of science (Atkins et al., 2009 ; Azmitia & Crowley, 2001 ). In collaborative scientific thinking, there is usually a triggering event, such as an unexpected result or a situation that a student does not understand. Other members of the group then add new information to the person's representation of knowledge, often contributing new inductions and deductions that both challenge and transform the reasoner's old representations (Chi & Roscoe, 2002 ; Dunbar, 1998 ). Social mechanisms thus play a key role in fostering conceptual change, a role that has been ignored in traditional cognitive research but is crucial for both science and science education. In science education there has been a shift toward collaborative learning, particularly at the elementary level; in university education, however, the emphasis is still on the individual scientist. As many domains of science now involve collaborations across scientific disciplines, we expect the explicit teaching of heuristics for collaborative science to increase.

What is the best way to teach and learn science? Surprisingly, the answer to this question has been difficult to uncover. For example, toward the end of the last century, influenced by several thinkers who advocated a constructivist approach to learning, ranging from Piaget (Beilin, 1994 ) to Papert ( 1980 ), many schools answered this question by adopting a philosophy dubbed “discovery learning.” Although a clear operational definition of this approach has yet to be articulated, the general idea is that children are expected to learn science by reconstructing the processes of scientific discovery—in a range of areas from computer programming to chemistry to mathematics. The premise is that letting students discover principles on their own, set their own goals, and collaboratively explore the natural world produces deeper knowledge that transfers widely.

The research literature on science education is far from consistent in its use of terminology. However, our reading suggests that "discovery learning" differs from "inquiry-based learning" in that few, if any, guidelines are given to students in discovery learning contexts, whereas in inquiry learning, students are given hypotheses and specific goals to achieve (see the second paragraph of this section for a definition of inquiry-based learning). Even though thousands of schools have adopted discovery learning as an alternative to more didactic approaches to teaching and learning, the evidence that it is more effective than traditional, direct, teacher-controlled instructional approaches is mixed, at best (Lorch et al., 2010 ; Minner, Levy, & Century, 2010 ). In several cases where the distinctions between direct instruction and more open-ended constructivist instruction have been clearly articulated, implemented, and assessed, direct instruction has proven superior to the alternatives (Chen & Klahr, 1999 ; Toth, Klahr, & Chen, 2000 ). For example, in a study of third- and fourth-grade children learning about experimental design, Klahr and Nigam ( 2004 ) found that many more children learned from direct instruction than from discovery learning. Furthermore, the few children who did manage to learn from the discovery method performed no better on a far-transfer test of scientific reasoning than the many children who learned from direct instruction.
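The control-of-variables strategy at issue in the Chen and Klahr ( 1999 ) studies has a simple logical core: a comparison between two experimental setups is informative only if exactly one variable differs between them. A minimal sketch, with hypothetical function names and a toy ramp-experiment domain of the kind used in that research:

```python
def differing_variables(setup_a, setup_b):
    """List the factors on which two experimental setups differ."""
    return [k for k in setup_a if setup_a[k] != setup_b[k]]

def is_unconfounded(setup_a, setup_b):
    """A valid control-of-variables test varies exactly one factor,
    so any difference in outcome can be attributed to that factor."""
    return len(differing_variables(setup_a, setup_b)) == 1

# Toy ramp experiments: which factors affect how far a ball rolls?
ramp_a = {"slope": "steep",   "surface": "smooth", "ball": "golf"}
ramp_b = {"slope": "shallow", "surface": "smooth", "ball": "golf"}
ramp_c = {"slope": "shallow", "surface": "rough",  "ball": "rubber"}

print(is_unconfounded(ramp_a, ramp_b))  # True: isolates the effect of slope
print(is_unconfounded(ramp_a, ramp_c))  # False: a confounded comparison
```

The point of direct instruction in these studies is precisely to make this one-difference rule explicit, rather than leaving children to induce it from unguided experimentation.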

The idea of children learning most of their science through a process of self-directed discovery has some romantic appeal, and it may accurately describe the personal experience of a handful of world-class scientists. However, the claim has generated some contentious disagreements (Kirschner, Sweller, & Clark, 2006 ; Klahr, 2010 ; Taber 2009 ; Tobias & Duffy, 2009 ), and the jury remains out on the extent to which most children can learn science that way.

Conclusions and Future Directions

The field of scientific thinking is now a thriving area of research with strong underpinnings in cognitive psychology and cognitive science. In recent years, a new professional society has been formed to facilitate this integrative and interdisciplinary approach to the psychology of science, with its own journal and regular professional meetings. 1 Clearly, these different aspects of scientific thinking need to be combined to produce a truly comprehensive picture of the scientific mind.

While much is known about certain aspects of scientific thinking, much more remains to be discovered. In particular, there has been little contact between cognitive, neuroscience, social, personality, and motivational accounts of scientific thinking. Research in thinking and reasoning has been expanded to use the methods and theories of cognitive neuroscience (see Morrison & Knowlton, Chapter 6 ). A similar approach can be taken in exploring scientific thinking (see Dunbar et al., 2007 ). There are two main reasons for taking a neuroscience approach to scientific thinking. First, functional neuroimaging allows the researcher to look at the entire human brain, making it possible to see the many different sites that are involved in scientific thinking and gain a more complete understanding of the entire range of mechanisms involved in this type of thought. Second, these brain-imaging approaches allow researchers to address fundamental questions in research on scientific thinking, such as the extent to which ordinary thinking in nonscientific contexts and scientific thinking recruit similar versus disparate neural structures of the brain.

Dunbar ( 2009 ) has used some novel methods to explore Simon's assertion, cited at the beginning of this chapter, that scientific thinking uses the same cognitive mechanisms that all human beings possess (rather than being an entirely different type of thinking) but combines them in ways that are specific to a particular aspect of science or a specific discipline of science. For example, Fugelsang and Dunbar ( 2009 ) compared causal reasoning when two colliding circular objects were labeled balls or labeled subatomic particles. They obtained different brain activation patterns depending on whether the stimuli were labeled balls or subatomic particles. In another series of experiments, Dunbar and colleagues used functional magnetic resonance imaging (fMRI) to study patterns of activation in the brains of students who have and who have not undergone conceptual change in physics. For example, Fugelsang and Dunbar ( 2005 ) and Dunbar et al. ( 2007 ) have found differences in the activation of specific brain sites (such as the anterior cingulate) for students when they encounter evidence that is inconsistent with their current conceptual understandings. These initial cognitive neuroscience investigations have the potential to reveal the ways that knowledge is organized in the scientific brain and provide detailed accounts of the nature of the representation of scientific knowledge. Petitto and Dunbar ( 2004 ) proposed the term "educational neuroscience" for the integration of research on education, including science education, with research on neuroscience. However, see Fitzpatrick (in press) for a very different perspective on whether neuroscience approaches are relevant to education. Clearly, research on the scientific brain is just beginning. We as scientists are beginning to get a reasonable grasp of the inner workings of the subcomponents of the scientific mind (i.e., problem solving, analogy, induction). However, great advances remain to be made concerning how these processes interact so that scientific discoveries can be made. Future research will focus on both the collaborative aspects of scientific thinking and the neural underpinnings of the scientific mind.

1. The International Society for the Psychology of Science and Technology (ISPST). Available at http://www.ispstonline.org/

Ahn, W., Kalish, C. W., Medin, D. L., & Gelman, S. A. ( 1995 ). The role of covariation versus mechanism information in causal attribution.   Cognition , 54 , 299–352.

American Association for the Advancement of Science. ( 1993 ). Benchmarks for scientific literacy . New York: Oxford University Press.

Atkins, L. J., Velez, L., Goudy, D., & Dunbar, K. N. ( 2009 ). The unintended effects of interactive objects and labels in the science museum.   Science Education , 54 , 161–184.

Azmitia, M. A., & Crowley, K. ( 2001 ). The rhythms of scientific thinking: A study of collaboration in an earthquake microworld. In K. Crowley, C. Schunn, & T. Okada (Eds.), Designing for science: Implications from everyday, classroom, and professional settings (pp. 45–72). Mahwah, NJ: Erlbaum.

Bacon, F. ( 1620 /1854). Novum organum (B. Montague, Trans.). Philadelphia, PA: Parry & McMillan.

Baillargeon, R. ( 2004 ). Infants' reasoning about hidden objects: Evidence for event-general and event-specific expectations (article with peer commentaries and response, listed below).   Developmental Science , 54 , 391–424.

Baker, L. M., & Dunbar, K. ( 2000 ). Experimental design heuristics for scientific discovery: The use of baseline and known controls.   International Journal of Human Computer Studies , 54 , 335–349.

Beilin, H. ( 1994 ). Jean Piaget's enduring contribution to developmental psychology. In R. D. Parke, P. A. Ornstein, J. J. Rieser, & C. Zahn-Waxler (Eds.), A century of developmental psychology (pp. 257–290). Washington, DC US: American Psychological Association.

Biswal, B. B., Mennes, M., Zuo, X.-N., Gohel, S., Kelly, C., Smith, S.M., et al. ( 2010 ). Toward discovery science of human brain function.   Proceedings of the National Academy of Sciences of the United States of America , 107, 4734–4739.

Brewer, W. F., & Samarapungavan, A. ( 1991 ). Children's theories vs. scientific theories: Differences in reasoning or differences in knowledge? In R. R. Hoffman & D. S. Palermo (Eds.), Cognition and the symbolic processes: Applied and ecological perspectives (pp. 209–232). Hillsdale, NJ: Erlbaum.

Bruner, J. S., Goodnow, J. J., & Austin, G. A. ( 1956 ). A study of thinking . New York: NY Science Editions.

Carey, S. ( 1985 ). Conceptual change in childhood . Cambridge, MA: MIT Press.

Carruthers, P., Stich, S., & Siegal, M. ( 2002 ). The cognitive basis of science . New York: Cambridge University Press.

Chi, M. ( 1992 ). Conceptual change within and across ontological categories: Examples from learning and discovery in science. In R. Giere (Ed.), Cognitive models of science (pp. 129–186). Minneapolis: University of Minnesota Press.

Chi, M. T. H., & Roscoe, R. D. ( 2002 ). The processes and challenges of conceptual change. In M. Limon & L. Mason (Eds.), Reconsidering conceptual change: Issues in theory and practice (pp. 3–27). Amsterdam, Netherlands: Kluwer Academic Publishers.

Chen, Z., & Klahr, D. ( 1999 ). All other things being equal: Children's acquisition of the control of variables strategy.   Child Development , 54 (5), 1098–1120.

Clement, J. ( 1982 ). Students' preconceptions in introductory mechanics.   American Journal of Physics , 54 , 66–71.

Cohen, L. B., & Cashon, C. H. ( 2006 ). Infant cognition. In W. Damon & R. M. Lerner (Series Eds.) & D. Kuhn & R. S. Siegler (Vol. Eds.), Handbook of child psychology. Vol. 2: Cognition, perception, and language (6th ed., pp. 214–251). New York: Wiley.

National Commission on Excellence in Education. ( 1983 ). A nation at risk: The imperative for educational reform . Washington, DC: US Department of Education.

Crick, F. H. C. ( 1988 ). What mad pursuit: A personal view of science . New York: Basic Books.

Darden, L. ( 2002 ). Strategies for discovering mechanisms: Schema instantiation, modular subassembly, forward chaining/backtracking.   Philosophy of Science , 69, S354–S365.

Davenport, J. L., Yaron, D., Klahr, D., & Koedinger, K. ( 2008 ). Development of conceptual understanding and problem solving expertise in chemistry. In B. C. Love, K. McRae, & V. M. Sloutsky (Eds.), Proceedings of the 30th Annual Conference of the Cognitive Science Society (pp. 751–756). Austin, TX: Cognitive Science Society.

diSessa, A. A. ( 2004 ). Contextuality and coordination in conceptual change. In E. Redish & M. Vicentini (Eds.), Proceedings of the International School of Physics “Enrico Fermi”: Research on physics education (pp. 137–156). Amsterdam, Netherlands: ISO Press/Italian Physics Society.

Dunbar, K. ( 1995 ). How scientists really reason: Scientific reasoning in real-world laboratories. In R. J. Sternberg, & J. Davidson (Eds.), Mechanisms of insight (pp. 365–395). Cambridge, MA: MIT press.

Dunbar, K. ( 1997 ). How scientists think: Online creativity and conceptual change in science. In T. B. Ward, S. M. Smith, & S. Vaid (Eds.), Conceptual structures and processes: Emergence, discovery and change (pp. 461–494). Washington, DC: American Psychological Association.

Dunbar, K. ( 1998 ). Problem solving. In W. Bechtel & G. Graham (Eds.), A companion to cognitive science (pp. 289–298). London: Blackwell.

Dunbar, K. ( 1999 ). The scientist InVivo : How scientists think and reason in the laboratory. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 85–100). New York: Plenum.

Dunbar, K. ( 2001 ). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, & B. Kokinov (Eds.), The analogical mind: Perspectives from cognitive science (pp. 313–334). Cambridge, MA: MIT Press.

Dunbar, K. ( 2002 ). Science as category: Implications of InVivo science for theories of cognitive development, scientific discovery, and the nature of science. In P. Carruthers, S. Stich, & M. Siegal (Eds.), The cognitive basis of science (pp. 154–170). New York: Cambridge University Press.

Dunbar, K. ( 2009 ). The biology of physics: What the brain reveals about our physical understanding of the world. In M. Sabella, C. Henderson, & C. Singh. (Eds.), Proceedings of the Physics Education Research Conference (pp. 15–18). Melville, NY: American Institute of Physics.

Dunbar, K., & Fugelsang, J. ( 2004 ). Causal thinking in science: How scientists and students interpret the unexpected. In M. E. Gorman, A. Kincannon, D. Gooding, & R. D. Tweney (Eds.), New directions in scientific and technical thinking (pp. 57–59). Mahwah, NJ: Erlbaum.

Dunbar, K., Fugelsang, J., & Stein, C. ( 2007 ). Do naïve theories ever go away? In M. Lovett & P. Shah (Eds.), Thinking with Data: 33 rd Carnegie Symposium on Cognition (pp. 193–206). Mahwah, NJ: Erlbaum.

Dunbar, K., & Sussman, D. ( 1995 ). Toward a cognitive account of frontal lobe function: Simulating frontal lobe deficits in normal subjects.   Annals of the New York Academy of Sciences , 54 , 289–304.

Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). ( 2007 ). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academies Press.

Einstein, A. ( 1950 ). Out of my later years . New York: Philosophical Library.

Erdos, P., Fajtlowicz, S., & Staton, W. ( 1991 ). Degree sequences in the triangle-free graphs,   Discrete Mathematics , 54 (91), 85–88.

Evans, J., & Rzhetsky, A. ( 2010 ). Machine science.   Science , 54 , 399–400.

Fay, A., & Klahr, D. ( 1996 ). Knowing about guessing and guessing about knowing: Preschoolers' understanding of indeterminacy.   Child Development , 54 , 689–716.

Fischler, H., & Lichtfeldt, M. ( 1992 ). Modern physics and students conceptions.   International Journal of Science Education , 54 , 181–190.

Fitzpatrick, S. M. (in press). Functional brain imaging: Neuro-turn or wrong turn? In M. M., Littlefield & J.M., Johnson (Eds.), The neuroscientific turn: Transdisciplinarity in the age of the brain. Ann Arbor: University of Michigan Press.

Fox-Keller, E. ( 1985 ). Reflections on gender and science . New Haven, CT: Yale University Press.

Fugelsang, J., & Dunbar, K. ( 2005 ). Brain-based mechanisms underlying complex causal thinking.   Neuropsychologia , 54 , 1204–1213.

Fugelsang, J., & Dunbar, K. ( 2009 ). Brain-based mechanisms underlying causal reasoning. In E. Kraft (Ed.), Neural correlates of thinking (pp. 269–279). Berlin, Germany: Springer.

Fugelsang, J., Stein, C., Green, A., & Dunbar, K. ( 2004 ). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory.   Canadian Journal of Experimental Psychology , 54 , 132–141.

Galilei, G. ( 1638 /1991). Dialogues concerning two new sciences (A. de Salvio & H. Crew, Trans.). Amherst, NY: Prometheus Books.

Galison, P. ( 2003 ). Einstein's clocks, Poincaré's maps: Empires of time . New York: W. W. Norton.

Gelman, R., & Baillargeon, R. ( 1983 ). A review of Piagetian concepts. In P. H. Mussen (Series Ed.) & J. H. Flavell & E. M. Markman (Vol. Eds.), Handbook of child psychology (4th ed., Vol. 3, pp. 167–230). New York: Wiley.

Gelman, S. A., & Kalish, C. W. ( 2006 ). Conceptual development. In D. Kuhn & R. Siegler (Eds.), Handbook of child psychology. Vol. 2: Cognition, perception and language (pp. 687–733). New York: Wiley.

Gelman, S., & Wellman, H. ( 1991 ). Insides and essences.   Cognition , 54 , 214–244.

Gentner, D. ( 2010 ). Bootstrapping the mind: Analogical processes and symbol systems.   Cognitive Science , 54 , 752–775.

Gentner, D., Brem, S., Ferguson, R. W., Markman, A. B., Levidow, B. B., Wolff, P., & Forbus, K. D. ( 1997 ). Analogical reasoning and conceptual change: A case study of Johannes Kepler.   The Journal of the Learning Sciences , 54 (1), 3–40.

Gentner, D., Holyoak, K. J., & Kokinov, B. ( 2001 ). The analogical mind: Perspectives from cognitive science . Cambridge, MA: MIT Press.

Gentner, D., & Jeziorski, M. ( 1993 ). The shift from metaphor to analogy in western science. In A. Ortony (Ed.), Metaphor and thought (2nd ed., pp. 447–480). Cambridge, England: Cambridge University Press.

Gianfelici, F. ( 2010 ). Machine science: Truly machine-aided science.   Science , 54 , 317–319.

Giere, R. ( 1993 ). Cognitive models of science . Minneapolis: University of Minnesota Press.

Gopnik, A. N., Meltzoff, A. N., & Kuhl, P. K. ( 1999 ). The scientist in the crib: Minds, brains and how children learn . New York: Harper Collins.

Gorman, M. E. ( 1989 ). Error, falsification and scientific inference: An experimental investigation.   Quarterly Journal of Experimental Psychology: Human Experimental Psychology , 41A , 385–412.

Gorman, M. E., Kincannon, A., Gooding, D., & Tweney, R. D. ( 2004 ). New directions in scientific and technical thinking . Mahwah, NJ: Erlbaum.

Gupta, A., Hammer, D., & Redish, E. F. ( 2010 ). The case for dynamic models of learners' ontologies in physics.   Journal of the Learning Sciences , 54 (3), 285–321.

Haufe, C., Elliott, K. C., Burian, R., & O'Malley, M. A. ( 2010 ). Machine science: What's missing.   Science , 54 , 318–320.

Hecht, E. ( 2011 ). On defining mass.   The Physics Teacher , 54 , 40–43.

Heit, E. ( 2000 ). Properties of inductive reasoning.   Psychonomic Bulletin and Review , 54 , 569–592.

Holyoak, K. J., & Thagard, P. ( 1995 ). Mental leaps . Cambridge, MA: MIT Press.

Karmiloff-Smith, A. ( 1988 ). The child is a theoretician, not an inductivist.   Mind and Language , 54 , 183–195.

Keil, F. C. ( 1999 ). Conceptual change. In R. Wilson & F. Keil (Eds.), The MIT encyclopedia of cognitive science . (pp. 179–182) Cambridge, MA: MIT press.

Kern, L. H., Mirels, H. L., & Hinshaw, V. G. ( 1983 ). Scientists' understanding of propositional logic: An experimental investigation.   Social Studies of Science , 54 , 131–146.

King, R. D. ( 2011 ). Rise of the robo scientists.   Scientific American , 54 (1), 73–77.

King, R. D., Rowland, J., Oliver, S. G., Young, M., Aubrey, W., Byrne, E., et al. ( 2009 ). The automation of science.   Science , 54 , 85–89.

Kirschner, P. A., Sweller, J., & Clark, R. ( 2006 ). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching.   Educational Psychologist , 54 , 75–86.

Klahr, D. ( 2000 ). Exploring science: The cognition and development of discovery processes . Cambridge, MA: MIT Press.

Klahr, D. ( 2010 ). Coming up for air: But is it oxygen or phlogiston? A response to Taber's review of constructivist instruction: Success or failure?   Education Review , 54 (13), 1–6.

Klahr, D., & Dunbar, K. ( 1988 ). Dual space search during scientific reasoning.   Cognitive Science , 54 , 1–48.

Klahr, D., & Nigam, M. ( 2004 ). The equivalence of learning paths in early science instruction: effects of direct instruction and discovery learning.   Psychological Science , 54 (10), 661–667.

Klahr, D. & Masnick, A. M. ( 2002 ). Explaining, but not discovering, abduction. Review of L. Magnani (2001) abduction, reason, and science: Processes of discovery and explanation.   Contemporary Psychology , 47, 740–741.

Klahr, D., & Simon, H. ( 1999 ). Studies of scientific discovery: Complementary approaches and convergent findings.   Psychological Bulletin , 54 , 524–543.

Klayman, J., & Ha, Y. ( 1987 ). Confirmation, disconfirmation, and information in hypothesis testing.   Psychological Review , 54 , 211–228.

Kozhevnikov, M., & Hegarty, M. ( 2001 ). Impetus beliefs as default heuristic: Dissociation between explicit and implicit knowledge about motion.   Psychonomic Bulletin and Review , 54 , 439–453.

Kuhn, T. ( 1962 ). The structure of scientific revolutions . Chicago, IL: University of Chicago Press.

Kuhn, D., Amsel, E., & O'Laughlin, M. ( 1988 ). The development of scientific thinking skills . Orlando, FL: Academic Press.

Kulkarni, D., & Simon, H. A. ( 1988 ). The processes of scientific discovery: The strategy of experimentation.   Cognitive Science , 54 , 139–176.

Langley, P. ( 2000 ). Computational support of scientific discovery.   International Journal of Human-Computer Studies , 54 , 393–410.

Langley, P. ( 2002 ). Lessons for the computational discovery of scientific knowledge. In Proceedings of the First International Workshop on Data Mining Lessons Learned (pp. 9–12).

Langley, P., Simon, H. A., Bradshaw, G. L., & Zytkow, J. M. ( 1987 ). Scientific discovery: Computational explorations of the creative processes . Cambridge, MA: MIT Press.

Lorch, R. F., Jr., Lorch, E. P., Calderhead, W. J., Dunlap, E. E., Hodell, E. C., & Freer, B. D. ( 2010 ). Learning the control of variables strategy in higher and lower achieving classrooms: Contributions of explicit instruction and experimentation.   Journal of Educational Psychology , 54 (1), 90–101.

Magnani, L., Carnielli, W., & Pizzi, C., (Eds.) ( 2010 ). Model-based reasoning in science and technology: Abduction, logic,and computational discovery. Series Studies in Computational Intelligence (Vol. 314). Heidelberg/Berlin: Springer.

Mandler, J.M. ( 2004 ). The foundations of mind: Origins of conceptual thought . Oxford, England: Oxford University Press.

Macpherson, R., & Stanovich, K. E. ( 2007 ). Cognitive ability, thinking dispositions, and instructional set as predictors of critical thinking.   Learning and Individual Differences , 54 , 115–127.

McCloskey, M., Caramazza, A., & Green, B. ( 1980 ). Curvilinear motion in the absence of external forces: Naive beliefs about the motion of objects.   Science , 54 , 1139–1141.

McDermott, L. C., & Redish, E. F. ( 1999 ). Resource letter on physics education research.   American Journal of Physics , 54 , 755.

Mestre, J. P. ( 1991 ). Learning and instruction in pre-college physical science.   Physics Today , 54 , 56–62.

Metz, K. E. ( 1995 ). Reassessment of developmental constraints on children's science instruction.   Review of Educational Research , 54 (2), 93–127.

Minner, D. D., Levy, A. J., & Century, J. ( 2010 ). Inquiry-based science instruction—what is it and does it matter? Results from a research synthesis years 1984 to 2002.   Journal of Research in Science Teaching , 54 (4), 474–496.

Mitchell, T. M. ( 2009 ). Mining our reality.   Science , 54 , 1644–1645.

Mitroff, I. ( 1974 ). The subjective side of science . Amsterdam, Netherlands: Elsevier.

Munakata, Y., Casey, B. J., & Diamond, A. ( 2004 ). Developmental cognitive neuroscience: Progress and potential.   Trends in Cognitive Sciences , 54 , 122–128.

Mynatt, C. R., Doherty, M. E., & Tweney, R. D. ( 1977 ). Confirmation bias in a simulated research environment: An experimental study of scientific inference.   Quarterly Journal of Experimental Psychology , 54 , 89–95.

Nersessian, N. ( 1998 ). Conceptual change. In W. Bechtel, & G. Graham (Eds.), A companion to cognitive science (pp. 157–166). London, England: Blackwell.

Nersessian, N. ( 1999 ). Models, mental models, and representations: Model-based reasoning in conceptual change. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 5–22). New York: Plenum.

Nersessian, N. J. ( 2002 ). The cognitive basis of model-based reasoning in science. In P. Carruthers, S. Stich, & M. Siegal (Eds.), The cognitive basis of science (pp. 133–152). New York: Cambridge University Press.

Nersessian, N. J. ( 2008 ). Creating scientific concepts . Cambridge, MA: MIT Press.

O'Malley, M. A. ( 2011 ). Exploration, iterativity and kludging in synthetic biology.   Comptes Rendus Chimie , 54 (4), 406–412.

Papert, S. ( 1980 ). Mindstorms: Children, computers, and powerful ideas . New York: Basic Books.

Penner, D. E., & Klahr, D. ( 1996 ). When to trust the data: Further investigations of system error in a scientific reasoning task.   Memory and Cognition , 54 (5), 655–668.

Petitto, L. A., & Dunbar, K. ( 2004 ). New findings from educational neuroscience on bilingual brains, scientific brains, and the educated mind. In K. Fischer & T. Katzir (Eds.), Building usable knowledge in mind, brain, and education Cambridge, England: Cambridge University Press.

Popper, K. R. ( 1959 ). The logic of scientific discovery . London, England: Hutchinson.

Qin, Y., & Simon, H.A. ( 1990 ). Laboratory replication of scientific discovery processes.   Cognitive Science , 54 , 281–312.

Reiser, B. J., Tabak, I., Sandoval, W. A., Smith, B., Steinmuller, F., & Leone, T. J. ( 2001 ). BGuILE: Strategic and conceptual scaffolds for scientific inquiry in biology classrooms. In S. M. Carver & D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–306). Mahwah, NJ: Erlbaum.

Riordan, M., Rowson, P. C., & Wu, S. L. ( 2001 ). The search for the Higgs boson.   Science , 54 , 259–260.

Rutherford, F. J., & Ahlgren, A. ( 1991 ). Science for all Americans. New York: Oxford University Press.

Samarapungavan, A. ( 1992 ). Children's judgments in theory choice tasks: Scientific rationality in childhood.   Cognition , 54 , 1–32.

Schauble, L., & Glaser, R. ( 1990 ). Scientific thinking in children and adults. In D. Kuhn (Ed.), Developmental perspectives on teaching and learning thinking skills. Contributions to Human Development , (Vol. 21, pp. 9–26). Basel, Switzerland: Karger.

Schunn, C. D., & Klahr, D. ( 1995 ). A 4-space model of scientific discovery. In Proceedings of the 17th Annual Conference of the Cognitive Science Society (pp. 106–111). Mahwah, NJ: Erlbaum.

Schunn, C. D., & Klahr, D. ( 1996 ). The problem of problem spaces: When and how to go beyond a 2-space model of scientific discovery. Part of symposium on Building a theory of problem solving and scientific discovery: How big is N in N-space search? In Proceedings of the 18th Annual Conference of the Cognitive Science Society (pp. 25–26). Mahwah, NJ: Erlbaum.

Shrager, J., & Langley, P. ( 1990 ). Computational models of scientific discovery and theory formation . San Mateo, CA: Morgan Kaufmann.

Siegler, R. S., & Liebert, R. M. ( 1975 ). Acquisition of formal scientific reasoning by 10- and 13-year-olds: Designing a factorial experiment.   Developmental Psychology , 54 , 401–412.

Simon, H. A. ( 1977 ). Models of discovery . Dordrecht, Netherlands: D. Reidel Publishing.

Simon, H. A., Langley, P., & Bradshaw, G. L. ( 1981 ). Scientific discovery as problem solving.   Synthese , 54 , 1–27.

Simon, H. A., & Lea, G. ( 1974 ). Problem solving and rule induction. In H. Simon (Ed.), Models of thought (pp. 329–346). New Haven, CT: Yale University Press.

Smith, E. E., Shafir, E., & Osherson, D. ( 1993 ). Similarity, plausibility, and judgments of probability.   Cognition. Special Issue: Reasoning and decision making , 54 , 67–96.

Sodian, B., Zaitchik, D., & Carey, S. ( 1991 ). Young children's differentiation of hypothetical beliefs from evidence.   Child Development , 54 , 753–766.

Taber, K. S. ( 2009 ). Constructivism and the crisis in U.S. science education: An essay review.   Education Review , 54 (12), 1–26.

Thagard, P. ( 1992 ). Conceptual revolutions . Cambridge, MA: MIT Press.

Thagard, P. ( 1999 ). How scientists explain disease . Princeton, NJ: Princeton University Press.

Thagard, P., & Croft, D. ( 1999 ). Scientific discovery and technological innovation: Ulcers, dinosaur extinction, and the programming language Java. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 125–138). New York: Plenum.

Tobias, S., & Duffy, T. M. (Eds.). ( 2009 ). Constructivist instruction: Success or failure? New York: Routledge.

Toth, E. E., Klahr, D., & Chen, Z. ( 2000 ) Bridging research and practice: A cognitively-based classroom intervention for teaching experimentation skills to elementary school children.   Cognition and Instruction , 54 (4), 423–459.

Tweney, R. D. ( 1989 ). A framework for the cognitive psychology of science. In B. Gholson, A. Houts, R. A. Neimeyer, & W. Shadish (Eds.), Psychology of science: Contributions to metascience (pp. 342–366). Cambridge, England: Cambridge University Press.

Tweney, R. D., Doherty, M. E., & Mynatt, C. R. ( 1981 ). On scientific thinking . New York: Columbia University Press.

Valdes-Perez, R. E. ( 1994 ). Conjecturing hidden entities via simplicity and conservation laws: Machine discovery in chemistry.   Artificial Intelligence , 54 (2), 247–280.

Von Hofsten, C. ( 1980 ). Predictive reaching for moving objects by human infants.   Journal of Experimental Child Psychology , 54 , 369–382.

Von Hofsten, C., Feng, Q., & Spelke, E. S. ( 2000 ). Object representation and predictive action in infancy.   Developmental Science , 54 , 193–205.

Vosnaidou, S. (Ed.). ( 2008 ). International handbook of research on conceptual change . New York: Taylor & Francis.

Vosniadou, S., & Brewer, W. F. ( 1992 ). Mental models of the earth: A study of conceptual change in childhood.   Cognitive Psychology , 54 , 535–585.

Wason, P. C. ( 1968 ). Reasoning about a rule.   Quarterly Journal of Experimental Psychology , 54 , 273–281.

Wertheimer, M. ( 1945 ). Productive thinking . New York: Harper.

Yang, Y. ( 2009 ). Target discovery from data mining approaches.   Drug Discovery Today , 54 (3–4), 147–154.


Humanities LibreTexts

1: Introduction to Critical Thinking, Reasoning, and Logic


  • Golden West College via NGE Far Press

What is thinking? It may seem strange to begin a logic textbook with this question. ‘Thinking’ is perhaps the most intimate and personal thing that people do. Yet the more you ‘think’ about thinking, the more mysterious it can appear. It is the sort of thing that one intuitively or naturally understands, and yet cannot describe to others without great difficulty. Many people believe that logic is very abstract, dispassionate, complicated, and even cold. But in fact the study of logic is nothing more intimidating or obscure than this: the study of good thinking.

  • 1.1: Prelude to Chapter
  • 1.2: Introduction and Thought Experiments- The Trolley Problem
  • 1.3: Truth and Its Role in Argumentation - Certainty, Probability, and Monty Hall. Only certain sorts of sentences can be used in arguments. We call these sentences propositions, statements or claims.
  • 1.4: Distinction of Proof from Verification; Our Biases and the Forer Effect
  • 1.5: The Scientific Method. The procedure that scientists use is also a standard form of argument. Its conclusions only give you the likelihood or the probability that something is true (if your theory or hypothesis is confirmed), and not the certainty that it’s true. But when it is done correctly, the conclusions it reaches are very well-grounded in experimental evidence.
  • 1.6: Diagramming Thoughts and Arguments - Analyzing News Media
  • 1.7: Creating a Philosophical Outline

Conceptual review on scientific reasoning and scientific thinking

  • Published: 30 April 2021
  • Volume 42, pages 4313–4325 (2023)


  • Carlos Díaz,
  • Birgit Dorner,
  • Heinrich Hussmann &
  • Jan-Willem Strijbos


When conducting a systematic analysis of the concept of scientific reasoning (SR), we found confusion regarding the definition of the concept, its characteristics and its blurred boundaries with the concept of scientific thinking (ST). Furthermore, some authors use the concepts as synonyms. These findings raised three issues we aimed to answer in the present study: (1) are SR and ST the same concept, (2) if not, what are the differences between them, and (3) how can SR and ST be characterised and operationalised for systematic research? We conducted a conceptual review using an integrative approach to analyse 166 texts. First, we found that thinking and reasoning might refer to different processes. Likewise, SR and ST can be characterised as distinct concepts. Furthermore, the review identified that differences found between the concepts of SR and ST are grounded in ontological and epistemological perspectives.





Acknowledgements

This research was supported by XXX [Project number: NNN], Institution1, and Institution2.

Data Accessibility Statement

The authors declare that the data supporting the findings of this study, as well as the coding schema used for analysing it, are available within the article and its supplementary information files.

Author information

Authors and Affiliations

Department of Psychology, Ludwig-Maximilians-Universität München, Munich, Germany

Carlos Díaz & Jan-Willem Strijbos

Interacting Minds Centre, Aarhus University, Jens Chr. Skous Vej 7, 4. floor, 8000, Aarhus, Denmark

Carlos Díaz

Department of Social Work, Katholische Stiftungshochschule München, Munich, Germany

Birgit Dorner

Media Informatics Group, Ludwig-Maximilians-Universität München, Munich, Germany

Heinrich Hussmann

Department of Educational Sciences, University of Groningen, Groningen, the Netherlands

Jan-Willem Strijbos


Corresponding author

Correspondence to Carlos Díaz .

Ethics declarations

Conflict of Interest

The authors whose names are listed immediately below certify that they have NO affiliations with or involvement in any organization or entity with any financial interest (such as honoraria; educational grants; participation in speakers’ bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent-licensing arrangements), or non-financial interest (such as personal or professional relationships, affiliations, knowledge or beliefs) in the subject matter or materials discussed in this manuscript. We declare that all the authors of the manuscript have read and agreed with the conflict of interest statement.

Author Names

[Anonymised].

Ethical Approval

The manuscript did not require ethical approval from a board, as the research is bibliographic and did not involve living subjects.

All ethical considerations and guidelines for high-standard research within the social sciences and humanities have been followed, including a system of review based on arbitration.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

(DOCX 34 kb)


About this article

Díaz, C., Dorner, B., Hussmann, H. et al. Conceptual review on scientific reasoning and scientific thinking. Curr Psychol 42 , 4313–4325 (2023). https://doi.org/10.1007/s12144-021-01786-5


Accepted : 21 April 2021

Published : 30 April 2021

Issue Date : February 2023

DOI : https://doi.org/10.1007/s12144-021-01786-5


  • Scientific reasoning
  • Scientific thinking
  • Concept review

What Are The Steps Of The Scientific Method?

Julia Simkus

Editor at Simply Psychology

BA (Hons) Psychology, Princeton University

Julia Simkus is a graduate of Princeton University with a Bachelor of Arts in Psychology. She is currently studying for a Master's degree in Counseling for Mental Health and Wellness, beginning in September 2023. Julia's research has been published in peer-reviewed journals.


Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.

Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


Science is not just knowledge. It is also a method for obtaining knowledge. Scientific understanding is organized into theories.

The scientific method is a step-by-step process used by researchers and scientists to determine if there is a relationship between two or more variables. Psychologists use this method to conduct psychological research, gather data, process information, and describe behaviors.

It involves careful observation, asking questions, formulating hypotheses, experimental testing, and refining hypotheses based on experimental findings.

How it is Used

The scientific method can be applied broadly in science across many different fields, such as chemistry, physics, geology, and psychology. In a typical application of this process, a researcher will develop a hypothesis, test this hypothesis, and then modify the hypothesis based on the outcomes of the experiment.

The process is then repeated with the modified hypothesis until the results align with the observed phenomena. Detailed steps of the scientific method are described below.

Keep in mind that the scientific method does not have to follow this fixed sequence of steps; rather, these steps represent a set of general principles or guidelines.
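The iterative cycle described above can be sketched as a loop. This is a toy illustration only; every function, name, and value here is a hypothetical stand-in, not a real experimental procedure:

```python
# Sketch of the scientific method as an iterative loop:
# hypothesize -> test -> refine -> test again.

def run_experiment(hypothesis):
    """Stand-in for data collection: report whether the observed data
    match the hypothesis' prediction (here, a hard-coded 'observation')."""
    observed_effect = "handwriting_improves_recall"
    return hypothesis["prediction"] == observed_effect

def refine(hypothesis):
    """Stand-in for revising a hypothesis after a failed test."""
    return {**hypothesis, "prediction": "handwriting_improves_recall"}

hypothesis = {"claim": "note-taking medium affects memory",
              "prediction": "no_difference"}
attempts = 0
while not run_experiment(hypothesis):
    hypothesis = refine(hypothesis)
    attempts += 1

print(f"hypothesis consistent with observations after {attempts} revision(s)")
```

In a real study, `run_experiment` would be months of data collection and `refine` a substantive rethinking of the claim; the point is only that the process repeats until prediction and observation agree.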

7 Steps of the Scientific Method

Psychology uses an empirical approach.

Empiricism (founded by John Locke) states that the only source of knowledge comes through our senses – e.g., sight, hearing, touch, etc.

Empirical evidence does not rely on argument or belief. Thus, empiricism is the view that all knowledge is based on or may come from direct observation and experience.

The empiricist approach of gaining knowledge through experience quickly became the scientific approach and greatly influenced the development of physics and chemistry in the 17th and 18th centuries.

Steps of the Scientific Method

Step 1: Make an Observation (Theory Construction)

Every researcher starts at the very beginning. Before diving in and exploring something, one must first determine what they will study – it seems simple enough!

By making observations, researchers can establish an area of interest. Once this topic of study has been chosen, a researcher should review existing literature to gain insight into what has already been tested and determine what questions remain unanswered.

This review shows what is already understood about the topic, which questions remain open, and whether the researcher is in a position to answer them.

Specifically, a literature review might involve examining a substantial amount of documented material, from academic journals to books dating back decades. The most relevant information the researcher gathers will appear in the introduction or abstract of the published study.

The background material and knowledge will help the researcher with the first significant step in conducting a psychology study, which is formulating a research question.

This is the inductive phase of the scientific process. Observations yield information that is used to formulate theories as explanations. A theory is a well-developed set of ideas that propose an explanation for observed phenomena.

Inductive reasoning moves from specific premises to a general conclusion. It starts with observations of phenomena in the natural world and derives a general law.

Step 2: Ask a Question

Once a researcher has made observations and conducted background research, the next step is to ask a scientific question. A scientific question must be defined, testable, and measurable.

A useful way to frame a scientific question is: “What is the effect of…?” or “How does X affect Y?”

To answer an experimental question, a researcher must identify two variables: the independent and dependent variables.

The independent variable is the variable manipulated (the cause), and the dependent variable is the variable being measured (the effect).

An example of a research question could be, “Is handwriting or typing more effective for retaining information?” Answering the research question and proposing a relationship between the two variables is discussed in the next step.

Step 3: Form a Hypothesis (Make Predictions)

A hypothesis is an educated guess about the relationship between two or more variables. A hypothesis is an attempt to answer your research question based on prior observation and background research. Theories tend to be too complex to be tested all at once; instead, researchers create hypotheses to test specific aspects of a theory.

For example, a researcher might ask about the connection between sleep and educational performance. Do students who get less sleep perform worse on tests at school?

It is crucial to consider the different questions one might have about a particular topic in order to formulate a reasonable hypothesis. One should also consider how the proposed causal relationship could be investigated.

It is important that the hypothesis is both testable against reality and falsifiable. This means that it can be tested through an experiment and can be proven wrong.

The falsification principle, proposed by Karl Popper , is a way of demarcating science from non-science. It suggests that for a theory to be considered scientific, it must be able to be tested and conceivably proven false.

To test a hypothesis, we first assume that there is no difference between the populations from which the samples were taken. This is known as the null hypothesis and predicts that the independent variable will not influence the dependent variable.

Examples of “if…then…” Hypotheses:

  • If one gets less than 6 hours of sleep, then one will do worse on tests than if one obtains more rest.
  • If one drinks lots of water before going to bed, one will have to use the bathroom often at night.
  • If one practices exercising and lifting weights, then one’s body will begin to build muscle.

The research hypothesis is often called the alternative hypothesis and predicts what change(s) will occur in the dependent variable when the independent variable is manipulated.

It states that the results are not due to chance and that they are significant in terms of supporting the theory being investigated.

Although one could state and write a scientific hypothesis in many ways, hypotheses are usually built like “if…then…” statements.

Step 4: Run an Experiment (Gather Data)

The next step in the scientific method is to test your hypothesis and collect data. A researcher will design an experiment to test the hypothesis and gather data that will either support or refute the hypothesis.

The exact research methods used to examine a hypothesis depend on what is being studied. A psychologist might use two primary forms of research: experimental research and descriptive research.

The scientific method is objective in that researchers do not let preconceived ideas or biases influence the collection of data, and systematic in that experiments are conducted in a logical way.

Experimental Research

Experimental research is used to investigate cause-and-effect associations between two or more variables. This type of research systematically controls an independent variable and measures its effect on a specified dependent variable.

Experimental research involves manipulating an independent variable and measuring the effect(s) on the dependent variable. Repeating the experiment multiple times is important to confirm that your results are accurate and consistent.

One of the significant advantages of this method is that it permits researchers to determine whether changes in one variable cause changes in another.

While experiments in psychology typically have many moving parts (and can be relatively complex), even a simple experiment allows researchers to establish cause-and-effect relationships between variables.

Most simple experiments use a control group, consisting of those who do not receive the treatment, and an experimental group, consisting of those who do receive the treatment.

An example of experimental research would be when a pharmaceutical company wants to test a new drug. They give one group a placebo (control group) and the other the actual pill (experimental group).
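A minimal simulation of such a trial can make the control-group logic concrete. All numbers below are invented for illustration: the group sizes, the 0–10 severity scale, and the assumed treatment effect are hypothetical:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical trial: symptom-severity scores on a 0-10 scale.
# We assume the drug lowers average severity from about 6 to about 4.
control = [random.gauss(6.0, 1.0) for _ in range(50)]       # placebo group
experimental = [random.gauss(4.0, 1.0) for _ in range(50)]  # real drug

mean_control = sum(control) / len(control)
mean_experimental = sum(experimental) / len(experimental)
effect = mean_control - mean_experimental  # estimated treatment effect

print(f"placebo mean severity: {mean_control:.2f}")
print(f"drug mean severity:    {mean_experimental:.2f}")
print(f"effect estimate:       {effect:.2f}")
```

Because the placebo group experiences everything except the active ingredient, the difference between the two group means estimates the effect of the drug itself.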

Descriptive Research

Descriptive research is generally used when it is challenging or even impossible to control the variables in question. Examples of descriptive analysis include naturalistic observation, case studies, and correlational studies.

One example of descriptive research is the phone surveys that marketers often use. While they typically do not allow researchers to identify cause and effect, correlational studies are quite common in psychology research. They make it possible to spot associations between distinct variables and to measure the strength of those relationships.

Step 5: Analyze the Data and Draw Conclusions

Once a researcher has designed and conducted the investigation and collected sufficient data, it is time to inspect the gathered information and judge what has been found. Researchers can summarize the data, interpret the results, and draw conclusions based on this evidence using analyses and statistics.

Upon completion of the experiment, you can collect your measurements and analyze the data using statistics. Based on the outcomes, you will either reject or retain your hypothesis.

Analyze the Data

So, how does a researcher determine what the results of their study mean? Statistical analysis can either support or refute a researcher’s hypothesis and can also be used to determine if the conclusions are statistically significant.

When outcomes are said to be “statistically significant,” it is improbable that these results are due to luck or chance. Based on these observations, investigators must then determine what the results mean.
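As a concrete sketch of what such a statistical analysis can look like, the following computes Welch's t statistic for two independent groups using only the standard library. The scores are invented, and a full analysis would also convert t into a p-value:

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    se = math.sqrt(var_a / len(a) + var_b / len(b))
    return (statistics.mean(a) - statistics.mean(b)) / se

# Invented scores for a treated group and a control group:
treated = [14, 15, 13, 14, 15, 14, 13, 15]
control = [12, 11, 13, 12, 11, 12, 13, 11]

t = welch_t(treated, control)
print(f"t = {t:.2f}")  # a large |t| makes "just chance" an unlikely explanation
```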

An experiment may support a hypothesis in some circumstances yet fail to do so in others.

What happens if the results of a psychology investigation do not support the researcher’s hypothesis? It does not mean that the study was worthless. Simply because the findings fail to support the researcher’s hypothesis does not mean that the examination is not helpful or instructive.

This kind of research plays a vital role in helping scientists develop new questions and hypotheses to investigate in the future. Once conclusions have been drawn, the next step is to communicate the results to the rest of the scientific community.

This is an integral part of the process because it contributes to the general knowledge base and can assist other scientists in finding new research routes to explore.

If the hypothesis is not supported, a researcher should acknowledge the experiment’s results, formulate a new hypothesis, and develop a new experiment.

We must avoid any reference to results proving a theory as this implies 100% certainty, and there is always a chance that evidence may exist that could refute a theory.

Draw Conclusions and Interpret the Data

When the empirical observations disagree with the hypothesis, a number of possibilities must be considered. It might be that the theory is incorrect, in which case it needs altering so that it fully explains the data.

Alternatively, it might be that the hypothesis was poorly derived from the original theory, in which case the scientists were expecting the wrong thing to happen.

It might also be that the research was poorly conducted, or used an inappropriate method, or there were factors in play that the researchers did not consider. This will begin the process of the scientific method again.

If the hypothesis is supported, the researcher can find more evidence to support their hypothesis or look for counter-evidence to strengthen their hypothesis further.

In either scenario, the researcher should share their results with the greater scientific community.

Step 6: Share Your Results

One of the final stages of the research cycle involves the publication of the research. Once the report is written, the researcher(s) may submit the work for publication in an appropriate journal.

Usually, this is done by writing up a study description and publishing the article in a professional or academic journal. The studies and conclusions of psychological work can be seen in peer-reviewed journals such as Developmental Psychology, Psychological Bulletin, the Journal of Social Psychology, and numerous others.

Scientists should report their findings by writing up a description of their study and any subsequent findings. This enables other researchers to build upon the present research or replicate the results.

As outlined by the American Psychological Association (APA), there is a typical structure of a journal article that follows a specified format. In these articles, researchers:

  • Supply a brief narrative and background on previous research
  • Give their hypothesis
  • Specify who participated in the study and how they were chosen
  • Provide operational definitions for each variable
  • Explain the measures and methods used to collect data
  • Describe how the data collected was interpreted
  • Discuss what the outcomes mean

A detailed record of psychological studies, and of all scientific studies, is vital to clearly explaining the steps and procedures used throughout the study, so that other researchers can repeat the experiment and replicate the results.

The editorial process utilized by academic and professional journals guarantees that each submitted article undergoes a thorough peer review to help assure that the study is scientifically sound. Once published, the investigation becomes another piece of the current puzzle of our knowledge “base” on that subject.

This last step is important because all results, whether they supported or did not support the hypothesis, can contribute to the scientific community. Publication of empirical observations leads to more ideas that are tested against the real world, and so on. In this sense, the scientific process is circular.

By replicating studies, psychologists can reduce errors, validate theories, and gain a stronger understanding of a particular topic.

Step 7: Repeat the Scientific Method (Iteration)

Now, if one’s hypothesis turns out to be supported, look for further evidence or for counter-evidence. If one’s hypothesis is not supported, create a new hypothesis and try again.

One may wish to revise the first hypothesis in order to design a more specific experiment or to test a different, narrower question.

The beauty of the scientific method is that it is a comprehensive yet straightforward process that scientists, and indeed everyone, can use over and over again.

So, draw conclusions and repeat: the scientific method is never-ending, and no result is ever considered final.

The scientific method is a process of:

  • Making an observation.
  • Forming a hypothesis.
  • Making a prediction.
  • Experimenting to test the hypothesis.

The procedure of repeating the scientific method is crucial to science and all fields of human knowledge.
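The repeating cycle can be caricatured as a loop. Everything below (the candidate hypotheses, the pretend "experiment", the dose numbers) is a toy stand-in for real observation and data collection:

```python
def run_scientific_method(observation, candidates, test):
    """Toy sketch of the cycle: hypothesize, test, revise until supported."""
    for hypothesis in candidates(observation):  # form / revise a hypothesis
        if test(hypothesis):                    # run the experiment, analyze
            return hypothesis                   # supported: report the result
    return None                                 # no candidate survived testing

# Toy question: which dose (mg) relieves a symptom in a pretend model?
def relieves(dose):
    return dose >= 30                           # the hidden "reality" being probed

result = run_scientific_method(
    observation="low doses seem ineffective",
    candidates=lambda obs: [10, 20, 30],        # hypotheses to try in order
    test=relieves,
)
print(result)  # → 30
```

Even a hypothesis that survives here would, in real research, be probed further for counter-evidence rather than declared proven.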

Further Information

  • Karl Popper – Falsification
  • Thomas Kuhn – Paradigm Shift
  • Positivism in Sociology: Definition, Theory & Examples
  • Is Psychology a Science?
  • Psychology as a Science (PDF)

List the six steps of the scientific method in order

  • Make an observation (theory construction)
  • Ask a question. A scientific question must be defined, testable, and measurable.
  • Form a hypothesis (make predictions)
  • Run an experiment to test the hypothesis (gather data)
  • Analyze the data and draw conclusions
  • Share your results so that other researchers can make new hypotheses

What is the first step of the scientific method?

The first step of the scientific method is making an observation. This involves noticing and describing a phenomenon or group of phenomena that one finds interesting and wishes to explain.

Observations can occur in a natural setting or within the confines of a laboratory. The key point is that the observation provides the initial question or problem that the rest of the scientific method seeks to answer or solve.

What is the scientific method?

The scientific method is a step-by-step process that investigators can follow to determine if there is a causal connection between two or more variables.

Psychologists and other scientists regularly propose explanations for human behavior. On a more casual level, people judge other people’s intentions, incentives, and actions daily.

While our standard assessments of human behavior are subjective and anecdotal, researchers use the scientific method to study psychology objectively and systematically.

Psychologists all utilize the scientific method to study distinct aspects of people’s thinking and behavior. This process not only allows scientists to analyze and understand various psychological phenomena but also gives investigators and others a way to disseminate and debate the results of their studies.

The outcomes of these studies are often noted in popular media, which leads many to wonder how or why researchers came to the findings they did.

Why Use the Six Steps of the Scientific Method

The goal of scientists is to better understand the world around us. Scientific research is the most critical tool for navigating and learning about our complex world.

Without it, we would be compelled to rely solely on intuition, other people’s authority, and luck. Through methodical scientific research, we can set aside our preconceived notions and superstitions and gain an objective sense of ourselves and our world.

All psychological studies aim to explain, predict, and even control or impact mental behaviors or processes. So, psychologists use and repeat the scientific method (and its six steps) to perform and record essential psychological research.

Accordingly, psychologists focus on understanding behavior and the cognitive (mental) and physiological (bodily) processes underlying behavior.

In everyday life, people use other means to understand the behavior of others, such as intuition and personal experience. The hallmark of scientific research, by contrast, is evidence to support a claim.

Scientific knowledge is empirical, meaning it is grounded in objective, tangible evidence that can be observed repeatedly, regardless of who is watching.

The scientific method is crucial because it minimizes the impact of bias or prejudice on the experimenter. Regardless of how hard one tries, even the best-intentioned scientists can’t escape bias.

Bias stems from personal opinions and cultural beliefs, meaning that everyone filters information through their own experience. Unfortunately, this “filtering” process can cause a scientist to favor one outcome over another.

For an everyday person trying to solve a minor issue at home or work, succumbing to these biases is not such a big deal; in fact, most of the time it is inconsequential.

But in the scientific community, where results must be inspected and reproduced, bias must be avoided.

When to Use the Six Steps of the Scientific Method?

One can use the scientific method anytime, anywhere! From the smallest conundrum to solving global problems, it is a process that can be applied to any science and any investigation.

Even if you are not considered a “scientist,” you will be surprised to know that people of all disciplines use it for all kinds of dilemmas.

Try to catch yourself next time you come across a question, and notice how you consciously or subconsciously use the scientific method.


Posted on the SDB Web Site Monday, July 26, 1999, Modified Wednesday, December 27, 2000


PLoS Comput Biol. 2019 Sep; 15(9).

Perspective: Dimensions of the scientific method

Eberhard O. Voit

Department of Biomedical Engineering, Georgia Institute of Technology and Emory University, Atlanta, Georgia, United States of America

The scientific method has been guiding biological research for a long time. It not only prescribes the order and types of activities that give a scientific study validity and a stamp of approval but also has substantially shaped how we collectively think about the endeavor of investigating nature. The advent of high-throughput data generation, data mining, and advanced computational modeling has thrown the formerly undisputed, monolithic status of the scientific method into turmoil. On the one hand, the new approaches are clearly successful and expect the same acceptance as the traditional methods, but on the other hand, they replace much of the hypothesis-driven reasoning with inductive argumentation, which philosophers of science consider problematic. Intrigued by the enormous wealth of data and the power of machine learning, some scientists have even argued that significant correlations within datasets could make the entire quest for causation obsolete. Many of these issues have been passionately debated during the past two decades, often with scant agreement. It is proffered here that hypothesis-driven, data-mining–inspired, and “allochthonous” knowledge acquisition, based on mathematical and computational models, are vectors spanning a 3D space of an expanded scientific method. The combination of methods within this space will most certainly shape our thinking about nature, with implications for experimental design, peer review and funding, sharing of results, education, medical diagnostics, and even questions of litigation.

The traditional scientific method: Hypothesis-driven deduction

Research is the undisputed core activity defining science. Without research, the advancement of scientific knowledge would come to a screeching halt. While it is evident that researchers look for new information or insights, the term “research” is somewhat puzzling. Never mind the prefix “re,” which simply means “coming back and doing it again and again,” the word “search” seems to suggest that the research process is somewhat haphazard, that not much of a strategy is involved in the process. One might argue that research a few hundred years ago had the character of hoping for enough luck to find something new. The alchemists come to mind in their quest to turn mercury or lead into gold, or to discover an elixir for eternal youth, through methods we nowadays consider laughable.

Today’s sciences, in stark contrast, are clearly different. Yes, we still try to find something new—and may need a good dose of luck—but the process is anything but unstructured. In fact, it is prescribed in such rigor that it has been given the widely known moniker “scientific method.” This scientific method has deep roots going back to Aristotle and Herophilus (approximately 300 BC), Avicenna and Alhazen (approximately 1,000 AD), Grosseteste and Robert Bacon (approximately 1,250 AD), and many others, but solidified and crystallized into the gold standard of quality research during the 17th and 18th centuries [ 1 – 7 ]. In particular, Sir Francis Bacon (1561–1626) and René Descartes (1596–1650) are often considered the founders of the scientific method, because they insisted on careful, systematic observations of high quality, rather than metaphysical speculations that were en vogue among the scholars of the time [ 1 , 8 ]. In contrast to their peers, they strove for objectivity and insisted that observations, rather than an investigator’s preconceived ideas or superstitions, should be the basis for formulating a research idea [ 7 , 9 ].

Bacon and his 19th century follower John Stuart Mill explicitly proposed gaining knowledge through inductive reasoning: Based on carefully recorded observations, or from data obtained in a well-planned experiment, generalized assertions were to be made about similar yet (so far) unobserved phenomena [ 7 ]. Expressed differently, inductive reasoning attempts to derive general principles or laws directly from empirical evidence [ 10 ]. An example is the 19th century epigram of the physician Rudolf Virchow, Omnis cellula e cellula . There is no proof that indeed “every cell derives from a cell,” but like Virchow, we have made the observation time and again and never encountered anything suggesting otherwise.

In contrast to induction, the widely accepted, traditional scientific method is based on formulating and testing hypotheses. From the results of these tests, a deduction is made whether the hypothesis is presumably true or false. This type of hypotheticodeductive reasoning goes back to William Whewell, William Stanley Jevons, and Charles Peirce in the 19th century [ 1 ]. By the 20th century, the deductive, hypothesis-based scientific method had become deeply ingrained in the scientific psyche, and it is now taught as early as middle school in order to teach students valid means of discovery [ 8 , 11 , 12 ]. The scientific method has not only guided most research studies but also fundamentally influenced how we think about the process of scientific discovery.

Alas, because biology has almost no general laws, deduction in the strictest sense is difficult. It may therefore be preferable to use the term abduction, which refers to the logical inference toward the most plausible explanation, given a set of observations, although this explanation cannot be proven and is not necessarily true.

Over the decades, the hypothesis-based scientific method did experience variations here and there, but its conceptual scaffold remained essentially unchanged ( Fig 1 ). Its key is a process that begins with the formulation of a hypothesis that is to be rigorously tested, either in the wet lab or computationally; nonadherence to this principle is seen as lacking rigor and can lead to irreproducible results [ 1 , 13 – 15 ].

Fig 1.

The central concept of the traditional scientific method is a falsifiable hypothesis regarding some phenomenon of interest. This hypothesis is to be tested experimentally or computationally. The test results support or refute the hypothesis, triggering a new round of hypothesis formulation and testing.

Going further, the prominent philosopher of science Sir Karl Popper argued that a scientific hypothesis can never be verified but that it can be disproved by a single counterexample. He therefore demanded that scientific hypotheses had to be falsifiable, because otherwise, testing would be moot [ 16 , 17 ] (see also [ 18 ]). As Gillies put it, “successful theories are those that survive elimination through falsification” [ 19 ]. Kelley and Scott agreed to some degree but warned that complete insistence on falsifiability is too restrictive as it would mark many computational techniques, statistical hypothesis testing, and even Darwin’s theory of evolution as nonscientific [ 20 ].

While the hypothesis-based scientific method has been very successful, its exclusive reliance on deductive reasoning is dangerous because according to the so-called Duhem–Quine thesis, hypothesis testing always involves an unknown number of explicit or implicit assumptions, some of which may steer the researcher away from hypotheses that seem implausible, although they are, in fact, true [ 21 ]. According to Kuhn, this bias can obstruct the recognition of paradigm shifts [ 22 ], which require the rethinking of previously accepted “truths” and the development of radically new ideas [ 23 , 24 ]. The testing of simultaneous alternative hypotheses [ 25 – 27 ] ameliorates this problem to some degree but not entirely.

The traditional scientific method is often presented in discrete steps, but it should really be seen as a form of critical thinking, subject to review and independent validation [ 8 ]. It has proven very influential, not only by prescribing valid experimentation, but also for affecting the way we attempt to understand nature [ 18 ], for teaching [ 8 , 12 ], reporting, publishing, and otherwise sharing information [ 28 ], for peer review and the awarding of funds by research-supporting agencies [ 29 , 30 ], for medical diagnostics [ 7 ], and even in litigation [ 31 ].

A second dimension of the scientific method: Data-mining–inspired induction

A major shift in biological experimentation occurred with the -omics revolution of the early 21st century. All of a sudden, it became feasible to perform high-throughput experiments that generated thousands of measurements, typically characterizing the expression or abundances of very many—if not all—genes, proteins, metabolites, or other biological quantities in a sample.

The strategy of measuring large numbers of items in a nontargeted fashion is fundamentally different from the traditional scientific method and constitutes a new, second dimension of the scientific method. Instead of hypothesizing and testing whether gene X is up-regulated under some altered condition, the leading question becomes which of the thousands of genes in a sample are up- or down-regulated. This shift in focus elevates the data to the supreme role of revealing novel insights by themselves ( Fig 2 ). As an important, generic advantage over the traditional strategy, this second dimension is free of a researcher’s preconceived notions regarding the molecular mechanisms governing the phenomenon of interest, which are otherwise the key to formulating a hypothesis. The prominent biologists Patrick Brown and David Botstein commented that “the patterns of expression will often suffice to begin de novo discovery of potential gene functions” [ 32 ].

Fig 2.

Data-driven research begins with an untargeted exploration, in which the data speak for themselves. Machine learning extracts patterns from the data, which suggest hypotheses that are to be tested in the lab or computationally.

This data-driven, discovery-generating approach is at once appealing and challenging. On the one hand, very many data are explored simultaneously and essentially without bias. On the other hand, the large datasets supporting this approach create a genuine challenge to understanding and interpreting the experimental results because the thousands of data points, often superimposed with a fair amount of noise, make it difficult to detect meaningful differences between sample and control. This situation can only be addressed with computational methods that first “clean” the data, for instance, through the statistically valid removal of outliers, and then use machine learning to identify statistically significant, distinguishing molecular profiles or signatures. In favorable cases, such signatures point to specific biological pathways, whereas other signatures defy direct explanation but may become the launch pad for follow-up investigations [ 33 ].
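The "cleaning" step mentioned above can be as simple as a robust outlier filter. The following is a minimal sketch using a median/MAD rule, not the specific procedure of any cited study, and the measurements are invented:

```python
import statistics

def remove_outliers(values, cutoff=3.5):
    """Drop points far from the median, scaled by the median
    absolute deviation (MAD), a robust estimate of spread."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    return [v for v in values if abs(v - med) <= cutoff * mad]

# Expression-like measurements with one spurious spike:
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.1, 97.0]
clean = remove_outliers(data)
print(clean)  # the 97.0 reading is removed; the rest survive
```

Only after such cleaning does it make sense to hand the data to a machine-learning step that searches for statistically significant signatures.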

Today’s scientists are very familiar with this discovery-driven exploration of “what’s out there” and might consider it a quaint quirk of history that this strategy was at first widely chastised and ridiculed as a “fishing expedition” [ 30 , 34 ]. Strict traditionalists were outraged that rigor was leaving science with the new approach and that sufficient guidelines were unavailable to assure the validity and reproducibility of results [ 10 , 35 , 36 ].

From the viewpoint of philosophy of science, this second dimension of the scientific method uses inductive reasoning and reflects Bacon’s idea that observations can and should dictate the research question to be investigated [ 1 , 7 ]. Allen [ 36 ] forcefully rejected this type of reasoning, stating “the thinking goes, we can now expect computer programs to derive significance, relevance and meaning from chunks of information, be they nucleotide sequences or gene expression profiles… In contrast with this view, many are convinced that no purely logical process can turn observation into understanding.” His conviction goes back to the 18th century philosopher David Hume and again to Popper, who identified as the overriding problem with inductive reasoning that it can never truly reveal causality, even if a phenomenon is observed time and again [ 16 , 17 , 37 , 38 ]. No number of observations, even if they always have the same result, can guard against an exception that would violate the generality of a law inferred from these observations [ 1 , 35 ]. Worse, Popper argued, through inference by induction, we cannot even know the probability of something being true [ 10 , 17 , 36 ].

Others argued that data-driven and hypothesis-driven research actually do not differ all that much in principle, as long as there is cycling between developing new ideas and testing them with care [ 27 ]. In fact, Kell and Oliver [ 34 ] maintained that the exclusive acceptance of hypothesis-driven programs misrepresents the complexities of biological knowledge generation. Similarly refuting the prominent rule of deduction, Platt [ 26 ] and Beard and Kushmerick [ 27 ] argued that repeated inductive reasoning, called strong inference, corresponds to a logically sound decision tree of disproving or refining hypotheses that can rapidly yield firm conclusions; nonetheless, Platt had to admit that inductive inference is not as certain as deduction, because it projects into the unknown. Lander compared the task of obtaining causality by induction to the problem of inferring the design of a microprocessor from input-output readings, which in a strict sense is impossible, because the microprocessor could be arbitrarily complicated; even so, inference often leads to novel insights and therefore is valuable [ 39 ].

An interesting special case of almost pure inductive reasoning is epidemiology, where hypothesis-driven reasoning is rare and instead, the fundamental question is whether data-based evidence is sufficient to associate health risks with specific causes [ 31 , 34 ].

Recent advances in machine learning and “big-data” mining have driven the use of inductive reasoning to unprecedented heights. As an example, machine learning can greatly assist in the discovery of patterns, for instance, in biological sequences [ 40 ]. Going a step further, a pithy article by Andersen [ 41 ] proffered that we may not need to look for causality or mechanistic explanations anymore if we just have enough correlation: “With enough data, the numbers speak for themselves, correlation replaces causation, and science can advance even without coherent models or unified theories.”

Of course, the proposal to abandon the quest for causality caused pushback on philosophical as well as mathematical grounds. Allen [ 10 , 35 ] considered the idea “absurd” that data analysis could enhance understanding in the absence of a hypothesis. He felt confident “that even the formidable combination of computing power with ease of access to data cannot produce a qualitative shift in the way that we do science: the making of hypotheses remains an indispensable component in the growth of knowledge” [ 36 ]. Succi and Coveney [ 42 ] refuted the “most extravagant claims” of big-data proponents very differently, namely by analyzing the theories on which machine learning is founded. They contrasted the assumptions underlying these theories, such as the law of large numbers, with the mathematical reality of complex biological systems. Specifically, they carefully identified genuine features of these systems, such as nonlinearities, nonlocality of effects, fractal aspects, and high dimensionality, and argued that they fundamentally violate some of the statistical assumptions implicitly underlying big-data analysis, like independence of events. They concluded that these discrepancies “may lead to false expectations and, at their nadir, even to dangerous social, economical and political manipulation.” To ameliorate the situation, the field of big-data analysis would need new strong theorems characterizing the validity of its methods and the numbers of data required for obtaining reliable insights. Succi and Coveney go as far as stating that too many data are just as bad as insufficient data [ 42 ].

While philosophical doubts regarding inductive methods will always persist, one cannot deny that -omics-based, high-throughput studies, combined with machine learning and big-data analysis, have been very successful [ 43 ]. Yes, induction cannot truly reveal general laws, no matter how large the datasets, but they do provide insights that are very different from what science had offered before and may at least suggest novel patterns, trends, or principles. As a case in point, if many transcriptomic studies indicate that a particular gene set is involved in certain classes of phenomena, there is probably some truth to the observation, even though it is not mathematically provable. Kepler’s laws of astronomy were arguably derived solely from inductive reasoning [ 34 ].

Notwithstanding the opposing views on inductive methods, successful strategies shape how we think about science. Thus, to take advantage of all experimental options while ensuring quality of research, we must not allow that “anything goes” but instead identify and characterize standard operating procedures and controls that render this emerging scientific method valid and reproducible. A laudable step in this direction was the wide acceptance of “minimum information about a microarray experiment” (MIAME) standards for microarray experiments [ 44 ].

A third dimension of the scientific method: Allochthonous reasoning

Parallel to the blossoming of molecular biology and the rapid rise in the power and availability of computing in the late 20th century, the use of mathematical and computational models became increasingly recognized as relevant and beneficial for understanding biological phenomena. Indeed, mathematical models eventually achieved cornerstone status in the new field of computational systems biology.

Mathematical modeling has been used as a tool of biological analysis for a long time [ 27 , 45 – 48 ]. Interesting for the discussion here is that the use of mathematical and computational modeling in biology follows a scientific approach that is distinctly different from the traditional and the data-driven methods, because it is distributed over two entirely separate domains of knowledge. One consists of the biological reality of DNA, elephants, and roses, whereas the other is the world of mathematics, which is governed by numbers, symbols, theorems, and abstract work protocols. Because the ways of thinking—and even the languages—are different in these two realms, I suggest calling this type of knowledge acquisition “allochthonous” (literally Greek: in or from a “piece of land different from where one is at home”; one could perhaps translate it into modern lingo as “outside one’s comfort zone”). De facto, most allochthonous reasoning in biology presently refers to mathematics and computing, but one might also consider, for instance, the application of methods from linguistics in the analysis of DNA sequences or proteins [ 49 ].

One could argue that biologists have employed “models” for a long time, for instance, in the form of “model organisms,” cell lines, or in vitro experiments, which more or less faithfully reflect features of the organisms of true interest but are easier to manipulate. However, this type of biological model use is rather different from allochthonous reasoning, as it does not leave the realm of biology and uses the same language and often similar methodologies.

A brief discussion of three experiences from our lab may illustrate the benefits of allochthonous reasoning. (1) In a case study of renal cell carcinoma, a dynamic model was able to explain an observed yet nonintuitive metabolic profile in terms of the enzymatic reaction steps that had been altered during the disease [50]. (2) A transcriptome analysis had identified several genes as displaying significantly different expression patterns during malaria infection in comparison to the state of health. Considered by themselves, and focusing solely on genes coding for specific enzymes of purine metabolism, the findings showed patterns that did not make sense. However, integrating the changes in a dynamic model revealed that purine metabolism globally shifted, in response to malaria, from guanine compounds to adenine, inosine, and hypoxanthine [51]. (3) Data capturing the dynamics of malaria parasites suggested growth rates that were biologically impossible. Speculation regarding possible explanations led to the hypothesis that many parasite-harboring red blood cells might “hide” from circulation and thereby from detection in the bloodstream. While experimental testing of the feasibility of the hypothesis would have been expensive, a dynamic model confirmed that such a concealment mechanism could indeed quantitatively explain the apparently very high growth rates [52]. In all three cases, the insights gained inductively from computational modeling would have been difficult to obtain purely with experimental laboratory methods.

Purely deductive allochthonous reasoning is the ultimate goal of the search for design and operating principles [53–55], which strives to explain why certain structures or functions are employed by nature time and again. An example is a linear metabolic pathway, in which feedback inhibition is essentially always exerted on the first step [56, 57]. This generality allows the deduction that an as yet unstudied linear pathway is most likely (or even certain) to be inhibited at the first step. Not strictly deductive, but rather abductive, was a study in our lab in which we analyzed time series data with a mathematical model that allowed us to infer the most likely regulatory structure of a metabolic pathway [58, 59].
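The feedback-inhibition design principle mentioned above can be illustrated with a minimal dynamic model. The sketch below is purely generic: a three-step linear pathway with made-up rate constants and a hypothetical Hill-type inhibition term, not any published model:

```python
# Generic sketch of a three-step linear pathway X1 -> X2 -> X3, in which the
# end product X3 inhibits the pathway's first (influx) step. All parameter
# values are illustrative only and do not describe any real pathway.
def simulate(inhibition, dt=0.01, steps=50_000):
    x1 = x2 = x3 = 0.0
    for _ in range(steps):
        v0 = 1.0 / (1.0 + inhibition * x3**4)  # influx, inhibited by X3
        v1 = 0.8 * x1                          # X1 -> X2
        v2 = 0.8 * x2                          # X2 -> X3
        v3 = 0.5 * x3                          # consumption of X3
        x1 += dt * (v0 - v1)                   # simple forward-Euler update
        x2 += dt * (v1 - v2)
        x3 += dt * (v2 - v3)
    return x1, x2, x3

print(simulate(0.0))  # no feedback: end product settles near 2.0
print(simulate(5.0))  # feedback on the first step damps the whole pathway
```

Comparing the two runs shows the qualitative effect of the design principle: with end-product inhibition acting on the first step, the steady-state level of the end product is strongly damped, and perturbations downstream are buffered at the point of entry.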

A typical allochthonous investigation begins in the realm of biology with the formulation of a hypothesis (Fig 3). Instead of testing this hypothesis with laboratory experiments, the system encompassing the hypothesis is moved into the realm of mathematics. This move requires two sets of ingredients. One set consists of the simplification and abstraction of the biological system: Any distracting details that seem unrelated to the hypothesis and its context are omitted or represented collectively with other details. This simplification step carries the greatest risk of the entire modeling approach, as omission of seemingly negligible but, in truth, important details can easily lead to wrong results. The second set of ingredients consists of correspondence rules that translate every biological component or process into the language of mathematics [60, 61].

Fig 3. This mathematical and computational approach is distributed over two realms, which are connected by correspondence rules.

Once the system is translated, it has become an entirely mathematical construct that can be analyzed purely with mathematical and computational means. The results of this analysis are also strictly mathematical. They typically consist of values of variables, magnitudes of processes, sensitivity patterns, signs of eigenvalues, or qualitative features like the onset of oscillations or the potential for limit cycles. Correspondence rules are used again to move these results back into the realm of biology. As an example, the mathematical result that “two eigenvalues have positive real parts” does not make much sense to many biologists, whereas the interpretation that “the system is not stable at the steady state in question” is readily explained. New biological insights may lead to new hypotheses, which are tested either by experiments or by returning once more to the realm of mathematics. The model design, diagnosis, refinements, and validation consist of several phases, which have been discussed widely in the biomathematical literature. Importantly, each iteration of a typical modeling analysis consists of a move from the biological to the mathematical realm and back.
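The eigenvalue example can be made concrete with a short computation. The Jacobian below is a made-up two-variable illustration, not taken from any particular biological model; the point is only the correspondence rule that maps a mathematical result (signs of the real parts of eigenvalues) back into a biological statement (stability of the steady state):

```python
import numpy as np

# Hypothetical Jacobian of a two-variable system, evaluated at a steady state.
# The numbers are invented for illustration.
J = np.array([[0.5, -2.0],
              [1.0, -0.3]])

eigenvalues = np.linalg.eigvals(J)

# Correspondence rule: if any eigenvalue has a positive real part,
# the steady state is not stable.
stable = all(ev.real < 0 for ev in eigenvalues)
print(eigenvalues)
print("stable" if stable else "not stable at this steady state")
```

Here the two eigenvalues form a complex-conjugate pair with positive real part 0.1, so the translated biological statement is that the system is not stable at the steady state in question.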

The reasoning within the realm of mathematics is often deductive, in the form of an Aristotelian syllogism, such as the well-known “All men are mortal; Socrates is a man; therefore, Socrates is mortal.” However, the reasoning may also be inductive, as is the case with large-scale Monte-Carlo simulations that generate arbitrarily many “observations,” although they cannot reveal universal principles or theorems. An example is a simulation randomly drawing numbers in an attempt to show that every real number has an inverse. The simulation will always attest to this hypothesis but fail to discover the truth, because it will never randomly draw 0. Generically, computational models may be considered sets of hypotheses, formulated as equations or as algorithms that reflect our perception of a complex system [27].
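The Monte-Carlo example in the text can be reproduced in a few lines; the sampling range, seed, and number of draws are arbitrary choices:

```python
import random

# Inductive "evidence" by simulation: draw random real numbers and check that
# each one has a multiplicative inverse. The simulation attests to the claim
# on every draw, yet it cannot discover the exception, because it will almost
# surely never draw exactly 0.
random.seed(1)
counterexamples = 0
for _ in range(100_000):
    x = random.uniform(-1.0, 1.0)
    if x == 0.0:   # the single real number without an inverse
        counterexamples += 1
    else:
        assert abs(x * (1.0 / x) - 1.0) < 1e-12  # inverse exists and checks out

print(counterexamples)  # almost certainly 0: induction misses the one exception
```

However many draws succeed, the simulation can only accumulate observations; it cannot establish the universal claim, which is exactly the limitation of inductive reasoning described above.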

Impact of the multidimensional scientific method on learning

Almost all we know in biology has come from observation, experimentation, and interpretation. The traditional scientific method not only offered clear guidance for this knowledge gathering, but it also fundamentally shaped the way we think about the exploration of nature. When presented with a new research question, scientists were trained to think immediately in terms of hypotheses and alternatives, pondering the best feasible ways of testing them, and designing in their minds strong controls that would limit the effects of known or unknown confounders. Shaped by the rigidity of this ever-repeating process, our thinking became trained to move forward one well-planned step at a time. This modus operandi was rigid and exact. It also minimized the erroneous pursuit of long speculative lines of thought, because every step required testing before a new hypothesis was formed. While effective, the process was also very slow and driven by ingenuity—as well as bias—on the scientist’s part. This bias was sometimes a hindrance to necessary paradigm shifts [22].

High-throughput data generation, big-data analysis, and mathematical-computational modeling changed all that within a few decades. In particular, the acceptance of inductive principles and of the allochthonous use of nonbiological strategies to answer biological questions created an unprecedented mix of successes and chaos. To the horror of traditionalists, the importance of hypotheses became minimized, and the suggestion spread that the data would speak for themselves [36]. Importantly, within this fog of “anything goes,” the fundamental question arose of how to determine whether an experiment was valid.

Because agreed-upon operating procedures affect research progress and interpretation, thinking, teaching, and the sharing of results, this question requires a deconvolution of scientific strategies. Here I proffer that the single scientific method of the past should be expanded toward a vector space of scientific methods, with spanning vectors that correspond to different dimensions of the scientific method (Fig 4).

Fig 4. The traditional hypothesis-based deductive scientific method is expanded into a 3D space that allows for synergistic blends of methods, including data-mining–inspired inductive knowledge acquisition and mathematical model-based allochthonous reasoning.

Obviously, all three dimensions have their advantages and drawbacks. The traditional, hypothesis-driven deductive method is philosophically “clean,” except that it is confounded by preconceptions and assumptions. The data-mining–inspired inductive method cannot offer universal truths but helps us explore very large spaces of factors that contribute to a phenomenon. Allochthonous, model-based reasoning can be performed mentally, with paper and pencil, through rigorous analysis, or with a host of computational methods that are precise and disprovable [27]. At the same time, these methods are incomparably faster, cheaper, and much more comprehensive than experiments in molecular biology. This reduction in cost and time, and the increase in coverage, may eventually have far-reaching consequences, as we can already fathom from much of modern physics.

Due to its long history, the traditional dimension of the scientific method is supported by clear and very strong standard operating procedures. Similarly strong procedures need to be developed for the other two dimensions. The MIAME rules for microarray analysis provide an excellent example [44]. On the mathematical modeling front, no such rules are generally accepted yet, but trends toward them seem to be emerging on the horizon. For instance, it seems to be becoming common practice to include sensitivity analyses in typical modeling studies and to assess the identifiability or sloppiness of ensembles of parameter combinations that fit a given dataset well [62, 63].

From a philosophical point of view, it seems unlikely that objections against inductive reasoning will disappear. However, instead of pitting hypothesis-based deductive reasoning against inductivism, it seems more beneficial to determine how the different methods can be synergistically blended (cf. [18, 27, 34, 42]) as linear combinations of the three vectors of knowledge acquisition (Fig 4). It is at this point unclear to what degree the identified three dimensions are truly independent of each other, whether additional dimensions should be added [24], or whether the different versions could be amalgamated into a single scientific method [18], especially if it is loosely defined as a form of critical thinking [8]. Nobel Laureate Percy Bridgman even concluded that “science is what scientists do, and there are as many scientific methods as there are individual scientists” [8, 64].
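As a toy illustration of the vector-space picture, a single study could be described by weights along the three axes; the weights below are entirely hypothetical, and the only formal requirements of such a "linear combination" would be nonnegative components and a meaningful normalization:

```python
import numpy as np

# Hypothetical decomposition of one blended study along the three proposed
# axes of the scientific method. The weights are invented for illustration.
axes = ("hypothesis-driven (deductive)",
        "data-driven (inductive)",
        "model-based (allochthonous)")
blend = np.array([0.2, 0.5, 0.3])  # e.g., 20% classical experimentation, etc.

# Sanity checks on the toy representation.
assert (blend >= 0).all() and np.isclose(blend.sum(), 1.0)

for axis, weight in zip(axes, blend):
    print(f"{axis}: {weight:.0%}")
```

Whether such a decomposition can actually be measured for real studies, and whether the axes are truly independent, are open questions raised in the text; the sketch only makes the geometric metaphor explicit.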

Combinations of the three spanning vectors of the scientific method have been emerging for some time. Many biologists already use inductive high-throughput methods to develop specific hypotheses that are subsequently tested with deductive or further inductive methods [34, 65]. In terms of including mathematical modeling, physics and geology have been leading the way for a long time, often by beginning an investigation in theory, before any actual experiment is performed. It will benefit biology to look into this strategy and to develop best practices of allochthonous reasoning.

The blending of methods may take quite different shapes. Early on, Ideker and colleagues [65] proposed an integrated experimental approach for pathway analysis that offered a glimpse of new experimental strategies within the space of scientific methods. In a similar vein, Covert and colleagues [66] included computational methods in such an integrated approach. Additional examples of blended analyses in systems biology can be seen in other works, such as [43, 67–73]. Generically, it is often beneficial to start with big data, determine patterns in associations and correlations, and then switch to the mathematical realm in order to filter out spurious correlations in a high-throughput fashion. If this procedure is executed in an iterative manner, the “surviving” associations have an increased level of confidence and are good candidates for further experimental or computational testing (personal communication from S. Chandrasekaran).
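The "big data first, then mathematical filtering" loop can be sketched in a deliberately simplified form. Everything below is synthetic: the data, the single truly associated feature, and a permutation-based filter standing in for the much richer model-based analyses described above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "omics" matrix: 200 samples x 50 features. By construction,
# only feature 0 truly drives the phenotype; the rest are noise.
n, p = 200, 50
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + rng.normal(size=n)

# Step 1 (inductive): rank features by absolute correlation with the phenotype.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(p)])

# Step 2 (computational filter): permutation test. A feature survives only if
# its correlation exceeds the 99th percentile of correlations obtained after
# shuffling the phenotype, which destroys any real association.
null = []
for _ in range(200):
    y_perm = rng.permutation(y)
    null.extend(abs(np.corrcoef(X[:, j], y_perm)[0, 1]) for j in range(p))
threshold = np.quantile(null, 0.99)

survivors = np.flatnonzero(corr > threshold)
print(survivors)  # feature 0 should survive; most noise features should not
```

Associations that survive such a filter (here, feature 0 by construction) would then be the candidates passed on to further experimental or model-based testing, and iterating the loop raises confidence in the survivors.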

If each component of a blended scientific method follows strict, commonly agreed guidelines, “linear combinations” within the 3D space can also be checked objectively, per deconvolution. In addition, guidelines for synergistic blends of component procedures should be developed. If we carefully monitor such blends, time will presumably indicate which method is best for which task and how the different approaches optimally inform each other. For instance, it will be interesting to study whether there is an optimal sequence of experiments along the three axes for a particular class of tasks. Big-data analysis together with inductive reasoning might be optimal for creating initial hypotheses and possibly refuting wrong speculations (“we had thought this gene would be involved, but apparently it isn’t”). If the logic of an emerging hypothesis can be tested with mathematical and computational tools, this will almost certainly be faster and cheaper than an immediate launch into wet-lab experimentation. It is also likely that mathematical reasoning will be able to refute some apparently feasible hypotheses and suggest amendments. Ultimately, the “surviving” hypotheses must still be tested for validity through conventional experiments. Deconvolving current practices and optimizing the combination of methods within the 3D or higher-dimensional space of scientific methods will likely result in better planning of experiments and in synergistic blends of approaches that have the potential to address some of the grand challenges in biology.

Acknowledgments

The author is very grateful to Dr. Sriram Chandrasekaran and Ms. Carla Kumbale for superb suggestions and invaluable feedback.

Funding Statement

This work was supported in part by grants from the National Science Foundation (https://www.nsf.gov/div/index.jsp?div=MCB), grants NSF-MCB-1517588 (PI: EOV) and NSF-MCB-1615373 (PI: Diana Downs), and by the National Institute of Environmental Health Sciences (https://www.niehs.nih.gov/), grant NIH-2P30ES019776-05 (PI: Carmen Marsit). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Creative Thinking vs. Critical Thinking

What's the difference?

Creative thinking and critical thinking are two distinct but equally important cognitive processes. Creative thinking involves generating new ideas, concepts, and solutions by exploring various possibilities and thinking outside the box. It encourages imagination, originality, and innovation. On the other hand, critical thinking involves analyzing, evaluating, and questioning ideas, arguments, and information to make informed decisions and judgments. It emphasizes logical reasoning, evidence-based thinking, and the ability to identify biases and fallacies. While creative thinking focuses on generating ideas, critical thinking focuses on evaluating and refining those ideas. Both thinking processes are essential for problem-solving, decision-making, and personal growth.

Further Detail

Introduction

Creative thinking and critical thinking are two distinct cognitive processes that play crucial roles in problem-solving, decision-making, and innovation. While they share some similarities, they also have distinct attributes that set them apart. In this article, we will explore the characteristics of creative thinking and critical thinking, highlighting their differences and showcasing how they complement each other in various contexts.

Creative Thinking

Creative thinking is a cognitive process that involves generating new ideas, concepts, or solutions by exploring possibilities, making connections, and thinking outside the box. It is characterized by originality, flexibility, and fluency of thought. Creative thinkers often challenge conventional wisdom, embrace ambiguity, and are open to taking risks. They are adept at finding alternative perspectives and exploring multiple solutions to problems.

One of the key attributes of creative thinking is the ability to think divergently. This means being able to generate a wide range of ideas or possibilities, often through brainstorming or free association. Creative thinkers are not limited by constraints and are willing to explore unconventional or unorthodox approaches to problem-solving.

Another important aspect of creative thinking is the ability to make connections between seemingly unrelated concepts or ideas. This skill, known as associative thinking, allows creative thinkers to draw upon a diverse range of knowledge and experiences to generate innovative solutions. They can see patterns, analogies, and relationships that others may overlook.

Furthermore, creative thinking involves the willingness to take risks and embrace failure as a learning opportunity. Creative thinkers understand that not all ideas will be successful, but they are not deterred by setbacks. They view failures as stepping stones towards finding the right solution and are persistent in their pursuit of innovative ideas.

In summary, creative thinking is characterized by divergent thinking, associative thinking, risk-taking, and persistence. It encourages the exploration of new ideas and unconventional approaches to problem-solving.

Critical Thinking

Critical thinking, on the other hand, is a cognitive process that involves analyzing, evaluating, and interpreting information to form reasoned judgments or decisions. It is characterized by logical, systematic, and objective thinking. Critical thinkers are skilled at identifying biases, assumptions, and fallacies in arguments, and they strive to make well-informed and rational decisions based on evidence.

One of the key attributes of critical thinking is the ability to think analytically. Critical thinkers break down complex problems or situations into smaller components, examine the relationships between them, and evaluate the evidence or information available. They are adept at identifying logical inconsistencies or flaws in reasoning, which helps them make sound judgments.

Another important aspect of critical thinking is the ability to evaluate information objectively. Critical thinkers are skeptical and question the validity and reliability of sources. They seek evidence, consider alternative viewpoints, and weigh the strengths and weaknesses of different arguments before forming their own opinions. This attribute is particularly valuable in today's information-rich society, where misinformation and biased narratives are prevalent.

Furthermore, critical thinking involves the ability to think systematically. Critical thinkers follow a logical and structured approach to problem-solving, ensuring that all relevant factors are considered. They are skilled at identifying assumptions, clarifying concepts, and drawing logical conclusions based on the available evidence. This systematic approach helps minimize errors and biases in decision-making.

In summary, critical thinking is characterized by analytical thinking, objective evaluation, skepticism, and systematic reasoning. It emphasizes the importance of evidence-based decision-making and helps individuals navigate complex and information-rich environments.

Complementary Attributes

While creative thinking and critical thinking have distinct attributes, they are not mutually exclusive. In fact, they often complement each other and can be seen as two sides of the same coin.

Creative thinking can benefit from critical thinking by providing a framework for evaluating and refining ideas. Critical thinking helps creative thinkers assess the feasibility, viability, and desirability of their innovative ideas. It allows them to identify potential flaws, consider alternative perspectives, and make informed decisions about which ideas to pursue further.

On the other hand, critical thinking can benefit from creative thinking by expanding the range of possibilities and solutions. Creative thinking encourages critical thinkers to explore unconventional approaches, challenge assumptions, and consider alternative viewpoints. It helps them break free from rigid thinking patterns and discover innovative solutions to complex problems.

Moreover, both creative thinking and critical thinking require open-mindedness and a willingness to embrace ambiguity. They both involve a certain level of discomfort and uncertainty, as individuals venture into uncharted territories of thought. By combining creative and critical thinking, individuals can develop a well-rounded cognitive toolkit that enables them to tackle a wide range of challenges.

Conclusion

Creative thinking and critical thinking are two distinct cognitive processes that bring unique attributes to problem-solving, decision-making, and innovation. Creative thinking emphasizes divergent thinking, associative thinking, risk-taking, and persistence, while critical thinking emphasizes analytical thinking, objective evaluation, skepticism, and systematic reasoning.

While they have their differences, creative thinking and critical thinking are not mutually exclusive. They complement each other and can be seen as two sides of the same coin. Creative thinking benefits from critical thinking by providing a framework for evaluation and refinement, while critical thinking benefits from creative thinking by expanding the range of possibilities and solutions.

By cultivating both creative and critical thinking skills, individuals can enhance their ability to navigate complex problems, make well-informed decisions, and drive innovation in various domains. These cognitive processes are not only valuable in academic and professional settings but also in everyday life, where the ability to think creatively and critically can lead to personal growth and success.
