First things first.

Preparing a job application and asked to take the CCTST? Or perhaps you are a student who can earn extra credit by taking it? Either way, if you have no idea how to prepare, don't worry: this comprehensive guide explains what the CCTST is and how to get ready for it.

What is the California Critical Thinking Skills Test?

The California Critical Thinking Skills Test is a discipline-neutral test used to evaluate the reasoning abilities of candidates. It has been used across all kinds of industries to test graduates' aptitude and logical thinking capabilities before shortlisting them for the next step in the recruitment process.

How does the test work?

The CCTST is designed to test whether a test-taker demonstrates the critical and logical thinking skills required to solve problems, and in effect shows real-world problem-solving ability. The test has the following features:

  • All items are multiple choice and use everyday scenarios specific to the test-taker group. The candidate must fully understand the question, draw an inference, and choose the option that best answers it.
  • Test items differ based on the test-taker group.
  • In an education setting, this test is administered to evaluate the candidates enrolling for a specific program, such as a Bachelor’s degree. It is also used to advise individual students, for program evaluation, accreditation, research and assessment of learning outcomes.
  • In a workplace setting, this test is used to evaluate an applicant’s reasoning skills in an effort to understand if the said candidate has the ability to fulfill the expectations of the organization.
  • The test usually runs 45-50 minutes, and the questions depend largely on the test-taker group.
  • The items on the CCTST are drawn, either at random or picked by the examiners, from a pool of items that were used to test candidates over the past 20 years.

How difficult is the California Critical Thinking Skills Test?

The test consists of around 34 multiple-choice questions to be solved within 45 minutes or so, which works out to roughly 1.3 minutes per question. The CCTST includes the kinds of questions you would find on most reasoning tests.
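The per-question budget quoted above is simple division. A quick sketch, using the approximate counts from this guide (actual counts vary by test version):

```python
# Approximate CCTST pacing: ~34 questions in ~45 minutes.
# These are the rough figures quoted above; test versions differ.
questions = 34
time_limit_min = 45

per_question = time_limit_min / questions
print(f"~{per_question:.1f} minutes per question")  # → ~1.3 minutes per question
```

When practicing with a timer, this is the pace to aim for on average; some items will take longer, so bank time on the easy ones.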

What is the test passing score?

The California Critical Thinking Skills Test is measured on an array of scales: Analysis, Inference, Deduction, Induction, Evaluation, and overall reasoning skills. The test provides a score for each of these. Most organizations set a cut-off score for selection (many require a score of 13).

What kind of abilities or knowledge do you need to pass this test?

As stated at the outset, this exam requires critical thinking and problem-solving abilities, and these differ based on the industry you are in. It is highly advisable to take practice exams numerous times to be prepared for the actual exam; your very first attempt will show you where your critical thinking abilities are weakest.

If you are unable to solve a problem, read the solution first and then retrace it backwards. With this technique, you learn to look at problems from a fresh perspective.

And finally, practice makes perfect. Solve many practice exams against a timer, and you will soon be surprised at how fast you can work through these problems.

How important is the CCTST for your evaluation?

In the United States, many organizations insist not only on passing this exam but on scoring high as well in order to advance in the recruitment process. Weakness in critical thinking leads to failure to learn, confused communication, ineffective enforcement of rules, and so on. Hence, employers and educators require candidates to demonstrate strong reasoning abilities.

This test is a standardized measure of reasoning abilities and has been used for decades by many organizations, both worldwide and in the United States. Hence, if this is a part of your assessment, be it for a job or an educational program, it’s very important that you score well on this test.

To sum things up

Taking the CCTST is well worth it: it adds value to your resume, and in the process of preparing for it you develop your critical thinking skills.


California Critical Thinking Skills Test (CCTST): Instructions

Please click here to view detailed instructions for accessing and completing the California Critical Thinking Skills Test (CCTST)

If you have any questions about the study or the CCTST questionnaire, you are welcome to contact Dr. Jennifer Hill, Director of the Office of Assessment, at (919) 668-1617 or  [email protected]

Click here for instructions to begin the California Critical Thinking Skills Test (CCTST)


Critical Thinking test

By 123test team. Updated May 12, 2023

Critical Thinking test reviews

This Critical Thinking test measures your ability to think critically and draw logical conclusions based on written information. Critical Thinking tests are often used in job assessments in the legal sector to assess a candidate's analytical critical thinking skills. A well-known example of a critical thinking test is the Watson-Glaser Critical Thinking Appraisal.


The test comprises the following five sections, with a total of 10 questions:

  • Analysing Arguments
  • Assumptions
  • Interpreting Information

Instructions: Critical Thinking test

Each question presents one or more paragraphs of text and a question about the information in the text. It's your job to figure out which of the options is the correct answer.

Below is a statement that is followed by an argument. You should consider this argument to be true. It is then up to you to determine whether the argument is strong or weak. Do not let your personal opinion about the statement play a role in your evaluation of the argument.

Statement: It would be good if people ate vegetarian more often.
Argument: No, because dairy production also requires keeping animals, which will later have to be eaten as well.

Is this a strong or weak argument?

Strong argument Weak argument

Statement: Germany should no longer use the euro as its currency.
Argument: No, because that would mean the 10 billion Deutschmarks that the introduction of the euro cost was money thrown away.

Overfishing is the phenomenon in which too many fish are caught in a certain area, leading to the disappearance of fish species in that area. This trend can only be reversed by means of catch reduction measures. These must therefore be introduced and enforced.

Assumption: The disappearance of fish species in areas of the oceans is undesirable.

Is the assumption made from the text?

Assumption is made Assumption is not made

As a company, we strive for satisfied customers. That's why from now on we're going to keep track of how quickly our help desk employees pick up the phone. Our goal is for that phone to ring for a maximum of 20 seconds.

Assumption: The company has tools or ways to measure how quickly help desk employees pick up the phone.

  • All reptiles lay eggs
  • All reptiles are vertebrates
  • All snakes are reptiles
  • All vertebrates have brains
  • Some reptiles hatch their eggs themselves
  • Most reptiles have two lungs
  • Many snakes only have one lung
  • Cobras are poisonous snakes
  • All reptiles are animals

Conclusion: Some snakes hatch their eggs themselves.

Does the conclusion follow from the statements?

Conclusion follows Conclusion does not follow

(Continue with the statements from question 5.)

Conclusion: Some animals that lay eggs only have one lung.
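A standard way to check a "conclusion follows" item is to look for a counterexample model: a small world in which every premise is true but the conclusion is false. If such a world exists, the conclusion does not follow. The sketch below builds one for the question-5 conclusion using Python sets; the particular animals are hypothetical placeholders, not part of the test.

```python
# Counterexample model for "Some snakes hatch their eggs themselves".
# World: one snake (a cobra) and one non-snake reptile (a gecko) that
# hatches its own eggs. The animals are illustrative placeholders.
reptiles = {"cobra", "gecko"}
snakes = {"cobra"}
hatch_own_eggs = {"gecko"}

# The relevant premises hold in this world:
assert snakes <= reptiles          # all snakes are reptiles
assert hatch_own_eggs & reptiles   # some reptiles hatch their own eggs

# ...but the conclusion is false here, so it does not follow:
assert not (snakes & hatch_own_eggs)
print("counterexample found: the conclusion does not follow")
```

The same method shows why the question-6 conclusion does follow: every world satisfying "all snakes are reptiles", "all reptiles are animals", "all reptiles lay eggs", and "many snakes only have one lung" must contain an egg-laying animal with one lung.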

In the famous 1971 Stanford experiment, 24 normal, healthy male students were randomly assigned as 'guards' (12) or 'prisoners' (12). The guards were given uniforms and instructed to keep order, but not to use force. The prisoners were given prison uniforms. Soon after the start of the experiment, the guards made up all kinds of punishments for the prisoners. Rebellious prisoners were subdued with a fire extinguisher, and public undressing and solitary confinement were also used as punishments. The guards' aggression grew stronger as the experiment progressed. At one point the abuses took place at night, because the guards thought the researchers were not watching. It also turned out that some guards enjoyed treating the prisoners very cruelly: prisoners had bags put over their heads and were chained by their ankles. The experiment was originally meant to last 14 days, but it was stopped after six.

The students who took part in the research did not expect to react the way they did in such a situation.

To what extent is this conclusion true, based on the given text?

True Probably true More information required Probably false False

(Continue with the text from 'Stanford experiment' in question 7.)

The results of the experiment support the claim that every young man (or at least some young men) is capable of turning into a sadist fairly quickly.

  • A flag is a tribute to the nation and should therefore not be hung outside at night. The flag is accordingly hoisted at sunrise and brought down at sunset. Only when a national flag is illuminated by spotlights on both sides may it remain up after sunset. There is a simple rule of thumb for when to bring the flag down: it is the moment at which the individual colors of the flag can no longer be distinguished.
  • A flag may not touch the ground.
  • No decorations or other additions may be placed on the Dutch flag, unless one is entitled to do so. Using a flag purely for decoration should also be avoided. Flag cloth, however, may be used for decoration, for example in the form of drapes.
  • The orange pennant is only used on birthdays of members of the Royal House and on King's Day. The orange pennant should be as long as, or slightly longer than, the diagonal of the flag.

Conclusion: One can assume that no Dutch flag will fly at government buildings at night, unless it is illuminated by spotlights on both sides.

Does the conclusion follow, based on the given text?

(Continue with the text from 'Dutch flag protocol' in question 9.)

Conclusion: If the protocol is followed, the orange pennant will always be longer than the horizontal bands/stripes of the flag.


Instruments

California Critical Thinking Skills Tests (CCTST): Description

The California Critical Thinking Skills Test (CCTST) is an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do. The CCTST is designed to engage the test-taker's reasoning skills. The CCTST tests range in number of items, though all are presented in multiple-choice format, and all use everyday scenarios appropriate to the intended test-taker group. Items range in difficulty and complexity.

The CCTST is not one test, but a dynamic family of tests - different versions for different age levels or professional fields.  These include:

  • California Critical Thinking Skills Test and California Critical Thinking Skills Test - Numeracy
  • Business Critical Thinking Skills Test and Business Critical Thinking Skills Test - Numeracy
  • Business Reasoning Test
  • CCTST M-Series for Children and Youth: CCTST MIB (grades 3-5), CCTST M20 (grades 6-9), and CCTST M25 (grades 6-9)
  • Health Science Reasoning Test and Health Science Reasoning Test - Numeracy
  • Test of Everyday Reasoning and Test of Everyday Reasoning - Numeracy
  • Legal Studies Reasoning Test
  • Military & Defense Critical Thinking Inventory
  • CCTST CCT-G835

The CCTST has been proven to predict strength in critical thinking in authentic problem situations and success on professional licensure examinations.  In educational settings, the CCTST is recommended for evaluating program applicants, advising individual students, learning outcomes assessment, program evaluation, accreditation and research.

The link provides access to purchase of the CCTST as well as its documentation.

Authors provide instrument validity and/or reliability information.


STELAR is not the author of these materials and cannot provide information on validity or permission for use. Permissions must be requested through the publisher or authors listed below.


California Critical Thinking Skills Test

CCTST Family of Tests measures critical thinking skills

Get data for: admissions, evaluating critical thinking skills, student success, advising, national benchmarking, accreditation, and educational research.

The  California Critical Thinking Skills Test (CCTST)  is an educational assessment that measures all the core reasoning skills needed for reflective decision-making. The CCTST provides valid and reliable data on critical thinking skills of individuals and of groups.  It is designed for use with undergraduate and graduate students. It is available in many languages and its OVERALL skills score can be benchmarked using one of many percentile comparisons. Clients most commonly use the  CCTST for admissions, advising and retention, studies of curriculum effectiveness, accreditation, and the documentation of student learning outcomes.

For assessment specs, administration, metrics reported, and more, scroll down. Contact us by using the “Request A Quote” button to ask a question. Or phone us at 650-697-5628 to speak with an assessment services client support specialist.

Seamless Testing. Results You Can Trust.

Higher Education

CCTST is calibrated for undergraduate and graduate level college students across the full spectrum of disciplines and fields of study.

Administration

Administered online with a secure, multi-lingual interface, it’s user-friendly and accessible anywhere.

Support Materials

User Manual includes all needed information about administering the assessment and interpreting the resulting individual and group scores.

Assessment Specs

55-minute timed administration; 40 engaging, scenario-based questions

Deliverables

Group graphics with statistical summary of scores; Excel spreadsheet of responses to all custom demographic questions, and all scores for each person tested. Optional individual score reports for administrators and/or test takers.

Results Reported

Metrics include scores for 8 critical skills, plus an OVERALL rating. Population percentile scores are available for benchmarking.

All of the CCTST metrics are on a 100-point scale with a corresponding qualitative rating (Superior, Strong, Moderate, Weak, Not Manifested).
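The qualitative rating is simply a banding of the 100-point score. A minimal sketch of such a mapping follows; note that the numeric cut points here are hypothetical placeholders for illustration, not Insight Assessment's published thresholds (those appear in the User Manual).

```python
# Map a 100-point CCTST-style score to a qualitative band.
# The cut points below are HYPOTHETICAL placeholders, not the
# publisher's actual thresholds.
BANDS = [
    (86, "Superior"),
    (76, "Strong"),
    (63, "Moderate"),
    (50, "Weak"),
    (0,  "Not Manifested"),
]

def rating(score: int) -> str:
    """Return the first band whose cutoff the score meets."""
    for cutoff, label in BANDS:
        if score >= cutoff:
            return label
    raise ValueError("score must be between 0 and 100")

print(rating(80))  # → Strong (under these placeholder cut points)
```

The point of the sketch is only that each metric carries two views of the same number: the raw 100-point score and its categorical interpretation.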

Available in English, Arabic, Chinese Simplified, Chinese Traditional, Dutch, French, German, Indonesian-Bahasa, Italian, Japanese, Korean, Norwegian, Portuguese, Spanish, Swedish, Thai, Turkish, and Vietnamese languages.

The CCTST provides 8 cognitive skill scores to focus future development and training. Items are drawn from a scientifically developed and tested item pool.

  • OVERALL Critical Thinking Skills – Sustained use of critical thinking to form reasoned judgments
  • Analysis – Accurate identification of the problem and decision-critical elements
  • Interpretation – Discovering and determining significance and contextual meaning
  • Inference – Drawing warranted and logical conclusions from reasons and evidence
  • Evaluation – Assessing credibility of claims and the strength of arguments
  • Explanation – Providing the evidence, reasons, assumptions, or rationale for judgments and decisions
  • Induction – Reasoned judgment in ambiguous, risky, and uncertain contexts
  • Deduction – Reasoned judgment in precisely defined, logically rigorous contexts
  • Numeracy – Sustained use of critical thinking skills in quantitative contexts (quantitative reasoning)

The  California Critical Thinking Skills Test (CCTST)  Report Package includes an individual test-taker report for each person assessed and group summary reports for each group and sub-group in the sample.

Reports are generated immediately after the conclusion of testing and are available for clients to download, making real-time assessment possible. Read more about how our customer support specialists work with clients to select their reporting options on our Services tab, or contact us for a consultation.

Group Analytics

  • Clients can generate and download Excel spreadsheet files of all scores (OVERALL, Percentile ranking and all cognitive score metrics). At the option of the client, these also include the responses to custom demographic questions added by the client to the assessment profile, and percentile scores corresponding to the external comparison group selected by the client.
  • Presentation-ready tables and graphic representations of the score distribution for OVERALL critical thinking skills and for the additional cognitive skill metrics.
  • Customers who have added custom demographic questions can generate sub-group reports for these variables, or for specific testing sessions or time periods.
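A client working with the downloaded spreadsheet can reproduce the kind of statistical summary described above in a few lines. The sketch below uses only Python's standard library on an invented list of OVERALL scores; the data and column meaning are assumptions for illustration, not real results.

```python
# Summarize a group's OVERALL scores the way a group report might:
# count, mean, min/max, and quartiles. The scores are invented
# sample data, not real test results.
from statistics import mean, quantiles

overall_scores = [68, 74, 81, 77, 90, 63, 72, 85, 79, 70]

q1, median, q3 = quantiles(overall_scores, n=4)
print(f"n={len(overall_scores)}  mean={mean(overall_scores):.1f}")
print(f"min={min(overall_scores)}  Q1={q1}  median={median}  "
      f"Q3={q3}  max={max(overall_scores)}")
```

The same pattern extends to sub-group reports: filter the rows by a demographic column first, then recompute the summary on the filtered scores.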

Optional Individual Test-Taker Reports

  • An overall score of critical thinking skills (OVERALL Score). OVERALL is reported on a 100-point scale accompanied by a qualitative rating (Superior, Strong, Moderate, Weak, Not Manifested), and a comparison percentile score.
  • Scores for each cognitive skill metric. These metrics are scored on a 100-point scale and are accompanied by a categorical interpretation of the strength of the score indicating areas of strength and areas for future development.
  • The Individual Test Taker Report can be pushed to an email address of the client’s choosing (for example, to an admissions office email, institutional assessment email, dean’s office email, etc.).
  • The client controls whether individual reports are made available to the test-taker.

Need to expedite your project?  We can have your first online testing assignment available for your students within 24 hours.  Request a Quote or get started by calling 650-697-5628 and speaking with one of our assessment specialists today.

Unlock your exclusive access to our resource library and training tools.

 Understand the depth of our metrics with hands-on tools designed to elucidate the reasoning behind our results. These tools empower you to interpret and apply our data in your professional journey.

Published Articles

Explore how our products are used in real-world scenarios, with comprehensive studies that showcase their impact and effectiveness.

Analytical Reports

Delve into critical thinking insights from our founders, offering fresh perspectives and groundbreaking approaches.

Your subscription includes access to our resource library and periodic emails that keep you informed and ahead in your field. Your privacy is important to us. We promise to keep your information safe and never spam you. You can unsubscribe at any time.


California Critical Thinking Skills Test


The Gold Standard Critical Thinking Skills Tests for Undergraduate and Graduate Programs

User Manual and Resource Guide

Published by Insight Assessment, a Division of the California Academic Press
Phone: (650) 697-5628
www.insightassessment.com

© 2016 California Academic Press, San Jose, CA. All rights reserved worldwide.

CCTST User Manual and Resource Guide © 2016 Insight Assessment / The California Academic Press, San Jose, CA. All rights reserved worldwide.

Willing and Able to Think Critically?

Assess Critical Thinking Mindset with the CCTDI and Skills with the CCTST

California Critical Thinking Disposition Inventory
Companion to the California Critical Thinking Skills Test

CCTST-N
CCTST
CCT-G835

Technical consultation is available through Insight Assessment.
Phone 650.697.5628
Fax 650.692.0141

Ethics of Performance Testing

Use, administration, scoring, and interpretation of the assessment tools published by the California Academic Press is the sole responsibility of the purchaser and user. Insight Assessment strongly recommends that persons using these testing tools observe ethical, academic, and professional standards in the use, administration, scoring, and interpretation of these instruments.

Many professional associations issue guidelines regarding ethical practices in educational testing. One might consult, for example, “Principles of Good Practice for Assessing Student Learning,” issued in December 1992 under the auspices of the American Association of Higher Education, One DuPont Circle, Suite 360, Washington DC, 20036; “Ethical Standards of the American Educational Research Association,” issued in October 1992, AERA, 1230 17th Street, NW, Washington DC 20036; and “Standards for Educational and Psychological Testing,” issued in 1985 by the American Psychological Association Inc., also at 1200 17th Street, NW, Washington DC, 20036.

Priority of the Current Update

This update supersedes all earlier versions of this assessment manual with or without ISBN numbers and all informational materials as may have been published on the Internet or in any other form or media by Insight Assessment / the California Academic Press regarding the assessment instrument(s) supported by this manual. In the event of discrepancies or inconsistencies between any earlier version of this manual or any other materials published by Insight Assessment / the California Academic Press and the current edition of this assessment manual, the information in the current edition of this manual should be given priority.

Complimentary Update

All Insight Assessment customers in good standing who have purchased single use licenses for the assessment instrument(s) this user manual supports are invited to request a complimentary PDF of the most updated version of this user manual at any time. To receive your updated copy of this manual, phone Insight Assessment at 650-697-5628 or email us at [email protected]

User Manual Editor and Project Director: Dee August
Content Management and Web Coordination: Kathryn Winterhalter
Design: James Morante

Images, Citations and Photos Credits: In addition to text and images developed by Insight Assessment, the assessment authors, or in the public domain, we acknowledge with gratitude the citations, photos and images that appear in this manual from Microsoft Exchange, Wikipedia Commons, and Measured Reasons LLC.

© 2016 by Insight Assessment / California Academic Press, San Jose, CA, 95112. All rights reserved. Printed in the United States of America. This document is protected by Copyright. Contact Insight Assessment to seek permission prior to reproducing or transmitting this assessment manual in whole or in part.

CCTST, CCTST-N, & CCT-G835

Table of Contents

Section 1: Critical Thinking

The opening theoretical section provides an overview of the assessment instrument and the core construct, “critical thinking.” Specifics about the instrument and the scores it reports are introduced. This section emphasizes the importance of critical thinking. Without the skills and the mindset to reason reflectively about their problems and decisions, individuals and communities significantly reduce their chances of survival and success.

  • The CCTST and “Critical Thinking”
  • Testing in Languages Other than English
  • Results Reporting and Score Array
  • Scores Reported
  • Score Descriptions
  • Versions that Include Numeracy
  • Why Measure Quantitative Reasoning (Numeracy)?
  • Why Measure Critical Thinking?
  • The APA Delphi Consensus Definition of Critical Thinking
  • The Importance of Being Willing and Able to Think Well

Section 2: Administration Options

This practical section describes the wide selection of comprehensive and customizable options available to you for your data collection. Whether you are planning to administer the CCTST through a browser, LMS, mobile device, or in paper-and-pencil format, our staff will explain use of the design options to help you tailor data collection to your project.

  • Purposes and Projects
  • Assessing Individuals: Admissions, Advising, Proficiency Testing, Intern or Student Placement, Professional Development, Hiring
  • Assessing Groups: Cohort Assessment, Outcomes Assessment, Program Efficacy, Group Proficiency, Staff Development
  • Preliminary Considerations
  • Choose the Right Test
  • Collect the Most Informative Data
  • Examples of Sampling Design
  • Motivate People to Give Their Best Effort
  • Consider Your Test Administration Option(s)
  • Learning Management System (LMS) Administration
  • Online Administration: Getting Started
  • Checklist for Hands-On Online Assessment Administration
  • Proctor Instructions: Online Assessment Administration
  • Paper-and-Pencil Administration: Getting Started
  • Proctor Instructions: Pencil-and-Paper Administration
  • Scoring Information - CapScore
  • Checklist for Preparing CapScore Response Form

Section 3: Results Reported

This section presents a step-by-step guide to the interpretation of the scores reported for this instrument. Assessment reporting formats include charts, statistical tables, spreadsheets, and individual reports.

  • Interpreting Score Reports
  • Interpreting Individual Test Taker Reports
  • Interpreting Spreadsheet Score Reports
      Step 1: Interpret Each Individual’s OVERALL Score
      Step 2: Examine Individual Comparison Percentile Scores
      Step 3: Examine the Performance Assessment for OVERALL to Determine the Strength of the Scores
      Step 4: Examine the Performance Assessment of the Scale Scores
  • Interpreting Group Score Reports
      The Table of Statistics and the Group Histogram
      Step 1: Interpret the Group’s Mean OVERALL Score
      Step 2: Examine the Mean of the Percentile Scores of the Group
      Step 3: Determine the Strength of the Mean OVERALL Score Using the Recommended Performance Assessments Table
      Step 4: Interpret the Scale Scores for this Group of Test Takers
  • Important Considerations when Analyzing Score Reports
      Difference Scores, Gains, Outliers, Discarding False Tests …
  • Assessing the Effects of an Educational Program

Section 4: Validity and Reliability

This section provides important information relating to the validity and reliability of Insight Assessment’s instruments. Major topics include content, construct, and criterion (predictive) validity, internal consistency, and test-retest reliability. Included are hyperlinks to published research reports about the validity and reliability of the instruments.

  • Content, Construct, and Criterion (Predictive) Validity
  • Internal Consistency Reliability (KR-20, Cronbach’s Alpha)
  • Test-Retest Reliability
  • Published Evidence of Validity and Reliability – Research Reports

Section 5: Resources<br />

This section provides some helpful information for teachers and trainers, and also conceptual information for those<br />

involved in developing learning outcomes assessment projects. If you are not reading this as a digital file, go to<br />

www.insightassessment.com/Resources to find all of these resources posted. We invite you to incorporate these<br />

links into program posts for educational purposes to help your trainees obtain the most up-to-date versions of these<br />

materials.<br />

Talking About <strong>Critical</strong> <strong>Thinking</strong> 72<br />

Teaching and Training Tools 74<br />

Research Findings 76<br />

Quotes about <strong>Thinking</strong> Courageously and Well 77<br />

6 CCTST User Manual and Resource Guide © 2016 Insight Assessment / The <strong>California</strong> Academic Press. San Jose CA. All rights reserved.<br />

Section 6: Customer Relationship<br />

This section provides important legal messages and notifications pertaining to the use of Insight Assessment test<br />

instrument use licenses, including the fundamental agreement for the use of testing licenses, non-disclosure and<br />

non-compete agreement, buyer qualification, privacy, data security, instrument protection, disability<br />

accommodation, and copyrights.<br />

User Licensure Agreement 78<br />

Privacy Policy 80<br />

Data Security 81<br />

Instrument Protection 82<br />

Disability Accommodation 84<br />

Copyright Notices 85<br />

Tables and Figures<br />

Table 1A: Partial Spreadsheet Report of Individual Demographics (right side) 44<br />

Table 1B: Partial Spreadsheet Report of Individual Scores (left side) 45<br />

Table 2: Descriptions of Recommended Performance Assessments OVERALL Scores 47<br />

Table 3: Recommended Performance Assessments for the OVERALL Score 48<br />

Table 4: Recommended Performance Assessments: 34-Point CCTST Scale Scores 49<br />

Table 5: Example of Scale Score Interpretation 50<br />

Table 6: Recommended Performance Assessments: 100-Point CCTST Scale Scores 51<br />

Table 7: Recommended Performance Assessments CCT-G835 Scale Scores 51<br />

Table 8: Group Scores for XYZ University 52<br />

Table 9: (Reprinted Table 8) Group Scores for XYZ University 56<br />

Figure 1: Sample Individual <strong>Test</strong> Taker Report 42<br />

Figure 2: OVERALL Score Distribution for XYZ University - Undergraduate Sample 53<br />

Figure 3: Recommended Performance Assessments of the XYZ University Sample 55<br />

Figure 4: Distribution of Numeracy Scores for ABCD University 57<br />

Figure 5: Distributions of Scale Scores for QRST University 58<br />

Figure 6: Pretest and Posttest OVERALL Scores Comparison 60<br />

Figure 7: Difference Scores Comparing Pretest with Posttest Scores 63<br />


Section 1:<br />

<strong>Critical</strong> <strong>Thinking</strong>:<br />

This opening theoretical section provides an overview of the assessment<br />

instrument and the core construct, “critical thinking.” Specifics about the<br />

instrument and the scores it reports are introduced. This section emphasizes<br />

the importance of critical thinking. Without the skills and the mindset to reason<br />

reflectively about their problems and decisions, individuals and communities<br />

significantly reduce their chances of survival and success.<br />

The CCTST and “<strong>Critical</strong> <strong>Thinking</strong>”<br />

The <strong>California</strong> <strong>Critical</strong> <strong>Thinking</strong> <strong>Skills</strong> <strong>Test</strong>, CCTST, in all of its many forms and versions, has been a premier instrument<br />

for measuring critical thinking skills for more than 25 years. The CCTST measures high-stakes reasoning and decision<br />

making processes. The CCTST and its associated instruments (such as the BCTST, HSRT, TER, BRT, MDCTI, LSRP, and<br />

TRAA) are based on the APA Delphi Consensus Definition of <strong>Critical</strong> <strong>Thinking</strong>, an interdisciplinary consensus<br />

that has been embraced worldwide as characterizing what is meant by the term ‘critical thinking.’ The item pool<br />

that supports these instruments has been refined through a continuing series of instrument development projects<br />

at varying educational levels, across varying disciplines, and in various languages, resulting in an array of critical<br />

thinking assessment instruments designed to validly and reliably assess candidates, trainees, students, and<br />

working professionals. There is an additional discussion of the APA research project at the end of this section. If this<br />

definition of critical thinking describes the skills you plan to assess, the CCTST will be an effective assessment<br />

instrument for your project.<br />

The CCTST and CCTST-N forms of this instrument are calibrated for<br />

Undergraduate and Graduate level students or comparable population<br />

groups. The CCT-G835 is calibrated to differentiate well in groups of<br />

doctoral level students and working professionals who have very strong<br />

educational preparation.<br />

Each skills test question requires that the test taker make an accurate and<br />

complete interpretation of the question, consider the information<br />

presented, and reason to the best option from among those provided. There<br />

are varying numbers of items on different tests within the CCTST family of<br />

reasoning skills tests, but in each case the length of the instrument and the time limit for test administration<br />

purposes are set to permit maximum performance within the range of possible effort for the intended test taker<br />

group.<br />


The validation studies of the first generic forms of the CCTST were<br />

conducted using a case-control methodology in college-level institutions in<br />

<strong>California</strong>. These studies led to the first generic version of the <strong>California</strong><br />

<strong>Critical</strong> <strong>Thinking</strong> <strong>Skills</strong> <strong>Test</strong> (CCTST). Subsequently, the item pool has been<br />

greatly expanded and now supports the testing of critical thinking skills in<br />

persons from Grades 3 through doctoral level trainees.<br />

Items used in versions of the CCTST are continually refined for their ability<br />

to capture the reasoning process of test takers and to expose common<br />

human reasoning errors that result from weak critical thinking. All forms of<br />

the CCTST provide both an OVERALL Score for critical thinking and a selection of scale scores to assist the trainer or<br />

instructor to focus curricula and training opportunities to address particular weaknesses in both individuals and<br />

groups. Items contained in the CCTST (all forms) are tested for their ability to differentiate well between individuals<br />

when the items are taken as a group. The items use everyday scenarios, appropriate to the intended test taker group.<br />

The response frame is in multiple choice format. Any specialized information needed to respond correctly is provided<br />

in the question itself.<br />

The newest forms of the CCTST provide an OVERALL Score that is the best<br />

comprehensive measure of an individual’s critical thinking skills, seven<br />

specific skills scores, and when applicable, a comparison percentile to allow<br />

administrators to compare their sample’s scores with an external criterion.<br />

OVERALL Score and scale scores are qualitatively interpretable in terms of<br />

performance.<br />

The CCTST instrument development team includes experts in critical<br />

thinking, assessment, psychometrics and measurement, statistics, and<br />

decision science. Continuing research on the CCTST focuses on the valid and<br />

reliable measurement of critical thinking skills at all levels of educational and occupational expertise. Specialized<br />

forms of the CCTST use item stems that have the context of the professional workplace targeted by the instrument.<br />

To assure that these contexts would be appropriately engaging for test takers, the development of these measures<br />

also involved consultation with working professionals in each of the professional areas.<br />

<strong>Test</strong>ing in Languages Other than English<br />

To obtain a valid measure of reasoning skills, it is necessary that there be no language or cultural barrier to<br />

understanding assessment items. Translations of the CCTST and related instruments are conducted using a rigorous<br />

process that addresses both of these issues. 1<br />

<br />

We specialize in high-quality, culturally appropriate translations of our copyright-protected, research-based<br />

tools. Insight Assessment researchers work with professional translators and with native<br />

speakers to develop valid and reliable translations. Translations undergo rigorous review and validation<br />

in the field. Authorized translations are currently available for global use. Additional translations can<br />

be made available by special request.<br />

1<br />

If the objective is to assess how well an individual can demonstrate evidence of critical thinking skills while communicating in a<br />

second language (for instance, English), the use of an English-language assessment instrument is appropriate, but the assessment<br />

score may be affected by the test taker’s English language comprehension.<br />


We also provide independent language flexibility in the test taker interface (TTI). The test taker can<br />

select from a variety of languages on the interface to assure they understand navigation and test<br />

instructions. It is important to us and our customers to only assess the individual’s thinking skills and<br />

attitudes, not their ESL abilities.<br />

<strong>Test</strong> taker reports can be delivered in multiple languages. Contact us to discuss your specific needs.<br />

Our multicultural capabilities go beyond the basic language to include many country-specific variants,<br />

date formatting, number formatting, image selection, proper names, and symbol and color selection.<br />

Cultural sensitivity is vital in the assessment of thinking.<br />

The CCTST is Available in:<br />

Arabic<br />

Chinese Simplified<br />

Chinese Traditional<br />

Dutch<br />

English<br />

French<br />

German<br />

Hebrew<br />

Indonesian<br />

Italian<br />

Korean<br />

Norwegian<br />

Portuguese<br />

Spanish<br />

Thai<br />

Turkish<br />

Vietnamese<br />

Authorized translations of the CCTST are<br />

available in many languages. Each<br />

authorized translation is the product of a<br />

collaborative effort between the<br />

instrument development team and an<br />

international scholar. The development<br />

and authorization of new language forms<br />

of the CCTST requires a validation study.<br />

Translation projects are underway which<br />

will expand the list seen here. Check our<br />

website for the most complete list of<br />

authorized translations.<br />

Visit our website for the most<br />

up-to-date list of available valid<br />

translations of the CCTST.<br />

As with all test instruments distributed<br />

by Insight Assessment, new language<br />

forms of the CCTST are authorized only<br />

when the new items and scales achieve<br />

psychometric performance standards.<br />

Scholars with interest in a possible<br />

translation project should consult the<br />

website for additional information.<br />


Results Reporting and Score Array<br />

All versions of the <strong>California</strong> <strong>Critical</strong> <strong>Thinking</strong> <strong>Skills</strong> <strong>Test</strong> (e.g. the CCTST-N) provide a comprehensive array of scores<br />

that are valid and reliable indicators of strength in critical thinking skill (higher scores) and, in some cases, indicators<br />

of those at risk for poor performance in academic programs or workplace situations (scores that fail to manifest<br />

critical thinking skill). This score package includes an OVERALL Score, a corresponding percentile score that<br />

benchmarks the OVERALL Score against an external comparison group selected by the client (criterion-based score),<br />

and an array of scale scores that describe relative strengths and weaknesses in specific critical thinking skills.<br />

The CCTST Score Package includes charts and statistical tables describing the aggregated scores<br />

for the group, as well as a spreadsheet showing every score for each individual in the group.<br />

Consult Section 3 of this document for details on how to interpret the reported scores.<br />

Online test administration includes individual reports that can be made available to the test<br />

takers at the option of the client.<br />
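The group statistics described above can be illustrated with a short sketch. The following is a hypothetical example, not Insight Assessment’s actual report format or data: it summarizes an illustrative list of individual OVERALL scores into the kind of statistics a group score report presents.<br />

```python
# Illustrative sketch only: the scores below are invented, and the
# layout is not Insight Assessment's actual report format.
from statistics import mean, median, stdev

overall_scores = [14, 16, 17, 17, 19, 20, 21, 23]  # hypothetical group

print(f"N      = {len(overall_scores)}")
print(f"Mean   = {mean(overall_scores):.2f}")    # 18.38
print(f"Median = {median(overall_scores):.1f}")  # 18.0
print(f"SD     = {stdev(overall_scores):.2f}")   # sample standard deviation
```

A spreadsheet report of individual scores could be reduced to a table of statistics such as this one, which is the role the Table of Statistics and the Group Histogram play in Section 3.<br />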


Scores Reported<br />

OVERALL Score is the most informative measure of an individual’s critical thinking skills. In order to receive a<br />

Superior or Strong OVERALL Score the individual must demonstrate skills in all of the cognitive skill areas associated<br />

with critical thinking.<br />

CCTST Scale Scores: <strong>Critical</strong> thinking is a holistic process, but different individuals and groups of individuals have<br />

been shown to have relative strengths and weaknesses in several easily addressed areas (described briefly below<br />

and in more detail in the Topics of Interest section of this test manual). The newest versions of the CCTST include<br />

the following scale scores identifying areas of strengths and relative weakness in cognitive skills associated with<br />

critical thinking:<br />

Analysis<br />

Interpretation<br />

Inference<br />

Evaluation<br />

Explanation<br />

Induction<br />

Deduction<br />

Numeracy (CCTST-N versions)<br />

Earlier versions of the CCTST and current paper-and-pencil versions of the CCTST provide an OVERALL Score, a<br />

percentile ranking and an array of five skill scale scores to inform test administrators of these relative strengths and<br />

weaknesses. The older versions do not break out Interpretation, Explanation, or Numeracy as separate scores.<br />

Recommended Performance Assessment: The CCTST OVERALL Score and all of the CCTST specific skill scores<br />

are qualitatively interpretable (for example, they can be determined to be Superior, Strong, Moderate, Weak, or Not<br />

Manifested). The specific modifiers used to interpret the quality of the scores are listed in the section of the manual<br />

entitled “Interpreting Score Reports.” These recommended performance assessments are based on both internal<br />

data analyses from available datasets as well as independent research reporting the relationship between scores<br />

and external performance variables (academic success, workplace transition, employer ratings).<br />
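The mapping from a numeric score to a qualitative performance assessment can be sketched as a simple banding function. The cut scores below are illustrative placeholders only, not the published CCTST thresholds (those are given in the Recommended Performance Assessments tables in Section 3):<br />

```python
# Hypothetical sketch: mapping an OVERALL score on a 34-point scale to a
# qualitative performance assessment. The cut scores are ILLUSTRATIVE
# placeholders, not the published CCTST thresholds.
ILLUSTRATIVE_BANDS = [
    (24, "Superior"),
    (19, "Strong"),
    (13, "Moderate"),
    (8, "Weak"),
    (0, "Not Manifested"),
]

def performance_assessment(overall_score: int) -> str:
    """Return the first band whose cut score the score meets or exceeds."""
    for cut, label in ILLUSTRATIVE_BANDS:
        if overall_score >= cut:
            return label
    return "Not Manifested"

print(performance_assessment(26))  # "Superior" under these illustrative cuts
```

The same banding logic applies to the scale scores; only the cut points differ by scale and version, as the tables in Section 3 describe.<br />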

Comparison Percentiles: For more than twenty-five years, Insight Assessment has made it possible for clients<br />

to confidently benchmark their assessment sample against student groups from K-12 through the highest level of<br />

university graduate programs and professional program students in Business and the Health Sciences. Our expert<br />

staff will assist you to select the most appropriate comparison groups available to benchmark your assessment<br />

sample scores to an external criterion. This is a scoring package option, and clients may decline it if<br />

preferred.<br />
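A comparison percentile of this kind can be sketched as a percentile rank: the percentage of scores in the comparison group that fall at or below a given score. The comparison-group data below is invented for illustration; actual benchmarking uses Insight Assessment’s norm groups.<br />

```python
# Illustrative sketch of a percentile rank against a comparison group.
# The norm-group scores are hypothetical, invented for this example.
from bisect import bisect_right

def percentile_rank(score, comparison_group):
    """Percent of comparison-group scores at or below the given score."""
    ordered = sorted(comparison_group)
    return 100.0 * bisect_right(ordered, score) / len(ordered)

norm_group = [11, 13, 14, 15, 16, 17, 18, 19, 21, 24]  # hypothetical
print(percentile_rank(17, norm_group))  # 60.0
```

Under this convention, a percentile of 60 would mean the individual scored at or above 60% of the comparison group, which is how Section 3 suggests reading the comparison percentile column.<br />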


Score Descriptions<br />

OVERALL: The reasoning skills OVERALL Score describes overall<br />

strength in using reasoning to form reflective judgments about<br />

what to believe or what to do. To score well overall, the test taker<br />

must excel in the sustained, focused and integrated application of<br />

core reasoning skills including analysis, interpretation, inference,<br />

evaluation, explanation, induction, and deduction. The OVERALL<br />

Score predicts the capacity for success in educational or workplace<br />

settings which demand reasoned decision making and thoughtful<br />

problem solving.<br />

ANALYSIS: Analytical reasoning skills enable people to identify assumptions, reasons, and claims, and to examine<br />

how they interact in the formation of arguments. We use analysis to gather information from charts, graphs,<br />

diagrams, spoken language, and documents. People with strong analytical skills attend to patterns and to details.<br />

They identify the elements of a situation and determine how those elements interact. Strong interpretation skills<br />

can support high-quality analysis by providing insights into the significance of what a person is saying or what<br />

something means.<br />

INTERPRETATION: Interpretative skills are used to determine the precise meaning and significance of a message<br />

or signal, whether it is a gesture, sign, set of data, written or spoken words, diagram, icon, chart, or graph. Correct<br />

interpretation depends on understanding the message in its context and in terms of who sent it, and for what<br />

purpose. Interpretation includes clarifying what something or someone means, grouping or categorizing<br />

information, and determining the significance of a message.<br />

INFERENCE: Inference skills enable us to draw conclusions from reasons and evidence. We use inference when we<br />

offer thoughtful suggestions and hypotheses. Inference skills help us to identify the necessary or the very probable<br />

consequences of a given set of facts and conditions. Conclusions, hypotheses, recommendations, or decisions that<br />

are based on faulty analyses, misinformation, bad data, or biased evaluations can turn out to be mistaken, even if<br />

they have been reached using excellent inference skills.<br />

EVALUATION: Evaluative reasoning skills enable us to assess the<br />

credibility of sources of information and the claims they make. We use<br />

these skills to determine the strength or weakness of<br />

arguments. Applying evaluation skills, we can judge the quality of<br />

analyses, interpretations, explanations, inferences, options, opinions,<br />

beliefs, ideas, proposals, and decisions. Strong explanation skills can<br />

support high-quality evaluation by providing the evidence, reasons,<br />

methods, criteria, or assumptions behind the claims made and the<br />

conclusions reached.<br />

EXPLANATION: Explanatory reasoning skills, when exercised prior to making a final decision about what to believe<br />

or what to do, enable us to describe the evidence, reasons, methods, assumptions, standards, or rationale for those<br />

decisions, opinions, beliefs, and conclusions. Strong explanatory skills enable people to discover, to test, and to<br />

articulate the reasons for beliefs, events, actions, and decisions.<br />

INDUCTION: Decision making in contexts of uncertainty relies on inductive reasoning. We use inductive reasoning<br />

skills when we draw inferences about what we think is probably true based on analogies, case studies, prior<br />

experience, statistical analyses, simulations, hypotheticals, and patterns recognized in familiar objects, events,<br />

experiences, and behaviors. As long as there is the possibility, however remote, that a highly probable conclusion<br />


might be mistaken even though the evidence at hand is unchanged, the reasoning is inductive. Although it does not<br />

yield certainty, inductive reasoning can provide a confident basis for solid belief in our conclusions and a reasonable<br />

basis for action.<br />

DEDUCTION: Decision making in precisely defined<br />

contexts where rules, operating conditions, core beliefs,<br />

values, policies, principles, procedures, and terminology<br />

completely determine the outcome depends on strong<br />

deductive reasoning skills. Deductive reasoning moves<br />

with exacting precision from the assumed truth of a set<br />

of beliefs to a conclusion that cannot be false if those<br />

beliefs are true. Deductive validity is rigorously logical<br />

and clear-cut. Deductive validity leaves no room for<br />

uncertainty, unless one alters the meanings of words or<br />

the grammar of the language.<br />

Versions that Include Numeracy<br />

Reasoning in mathematical contexts (Numeracy) is an important component of twenty-first-century education and<br />

a key skill for STEM, health care, and business-related programs and professional practice. The abilities to interpret<br />

graphs and charts that express information numerically, to frame problems with attention to quantitative data, and<br />

to make judgments based on the analysis and evaluation of mathematical information are only a few examples of<br />

why it is valuable to assess critical thinking skills in the context of numeracy.<br />

The CCTST-N measures critical thinking skills and provides a separate measure of Numeracy. Numeracy is vital for<br />

success in today’s heavily quantitative academic and professional learning and decision making environments. The<br />

measure of numeracy included with adoption of the CCTST-N defines Numeracy as follows:<br />

NUMERACY: Numeracy skills are used when applying<br />

knowledge of numbers, arithmetic, measures, and<br />

mathematical techniques to situations that require the<br />

interpretation or evaluation of information. Numeracy refers<br />

to the ability to solve quantitative reasoning problems, or to<br />

make judgments derived from quantitative reasoning in a<br />

variety of contexts. More than being able to compute a<br />

solution to a mathematical equation, numeracy includes the<br />

understanding of how quantitative information is gathered,<br />

manipulated, and represented visually, such as in graphs,<br />

charts, tables, and diagrams.<br />

If you are currently using the CCTST and would like to move to<br />

a version of the instrument that includes a reported score for<br />

Numeracy, contact your Insight Assessment representative or<br />

send us a note at “Contact Us” www.Insightassessment.com.<br />


Why Measure Quantitative Reasoning<br />

(Numeracy)?<br />

Numeracy is the ability to solve quantitative reasoning problems and to make well-reasoned judgments derived from<br />

quantitative information in a variety of contexts. More than being able to compute or calculate a solution to a<br />

mathematical equation, numeracy includes understanding how quantitative information is gathered, represented,<br />

and correctly interpreted using graphs, charts, tables, and diagrams. A person with strong numeracy skills can apply<br />

his or her knowledge of numbers, arithmetic, algebraic relationships, geometric relationships, and mathematical<br />

techniques to situations that require the interpretation or evaluation of quantitative information. The person with<br />

strong numeracy skills is able to recognize and use quantitative information, patterns, ratios, percentages, spatial<br />

relationships, and statistical information intelligently and correctly when drawing conclusions, making estimates,<br />

and explaining or predicting events or behavior. 2<br />

Strong numeracy skills distinguish successful business executives, managers, health care professionals, engineers,<br />

architects, scientists, real estate agents, sales professionals, financial analysts, and policy makers. Spreadsheets are<br />

the order of the day. Professionals in every field know that key decisions often depend on a thorough weighing of<br />

costs and benefits, accurate projections of likely outcomes, and the ability to interpret correctly the complex<br />

numerical relationships represented in tables, charts, graphs, blueprints, or diagrams.<br />

Numeracy is for everyone. From political polling data to the stats<br />

on the sports pages, from the economic news about stocks and<br />

interest rates, to the impact on our lives of the price of gas and food,<br />

our lives are awash in numerical data. What does an increase in the<br />

cost of living index or a decrease in the unemployment rate mean<br />

for me and my family? How important to my health is achieving a<br />

5% decrease in my risk of heart attack, my blood pressure, or my<br />

BMI? How much will it cost to earn a college degree and what<br />

impact would that degree have on my earning potential? If I put this<br />

purchase on my credit card, what will it actually cost me? How does<br />

a change in the tax code impact my take-home pay? 3<br />

The development of numeracy skills, like critical thinking skills, begins in childhood. Australia has identified<br />

numeracy as a national educational goal. That nation operationalizes numeracy for curricular purposes as<br />

“calculating and estimating, recognizing and using patterns, using fractions, decimals, ratios, rates and percentages,<br />

2<br />

Wiest, L. R., Higgins, H. J., & Frost, J. H. (2007). Quantitative literacy for social justice. Equity & Excellence in Education, 40(1), 47-55.<br />

Gittens, C. (2012). The disposition toward critical thinking: Math achievement is more than skill alone. Manuscript submitted for peer-reviewed publication.<br />

Salpeter, J. (2003). 21st century skills: Will our students be prepared? Technology & Learning, 24(3), 17-26.<br />

Steen, L. A. (1990). Numeracy. Daedalus, 19(2), Literacy in America (Spring 1990), 211-231.<br />

Steen, L. A. (1997). The new literacy. In L. Steen (Ed.), Why numbers count: Quantitative literacy for tomorrow’s America. Princeton, NJ: College Entrance Examination Board.<br />

Steen, L. A. (1999). Numeracy: The new literacy for a data-drenched society. Educational Leadership, 57(2), 8-13.<br />

Steen, L. A. (2000a). Reading, writing and numeracy. Liberal Education, 86(2), 26-37.<br />

Steen, L. A. (2000b). The case for quantitative literacy. Mathematics and Democracy: The Case for Quantitative Literacy. Princeton, NJ: National Council on Education and the Disciplines and Educational Reform Initiative, Woodrow Wilson National Fellowship Foundation.<br />

3<br />

Steen, L. A. (2000b). The case for quantitative literacy. Mathematics and Democracy: The Case for Quantitative Literacy. Princeton,<br />

NJ: National Council on Education and the Disciplines and Educational Reform Initiative. Woodrow Wilson National Fellowship<br />

Foundation.<br />


using spatial reasoning, interpreting and drawing conclusions from statistical information, and using measurement.” 4<br />

In the United States, the Common Core State Standards Initiative, a 2011 reform effort, locates critical thinking about<br />

math as a central learning outcome at all grade levels. 5 <strong>Critical</strong> thinking applied to math focuses on mathematical<br />

problem solving, quantitative reasoning, argument construction, argument evaluation, structural analysis,<br />

strategic application of tools to solve math problems, and modeling with mathematics. Numeracy skills can be<br />

thought of as the application of analysis, inference, interpretation, explanation, evaluation, as well as reflection on<br />

one’s own reasoning process (metacognition) to numerical and spatial information and relationships.<br />

Children, adolescents, and adults alike need to be able to think critically about<br />

the mathematical and numerical information that surrounds them in the<br />

media, on the Internet, in schools and workplaces, and in society at large. Dr.<br />

Carol Gittens points out “leading scholars and educators have consistently<br />

argued that numeracy rivals reading literacy and language fluency in its<br />

importance for learning and for life.” 6 Dr. Gittens notes that “numerically<br />

literate individuals understand the social and pragmatic function of<br />

mathematics and have the ability to reason about mathematical<br />

information.” Numeracy is essential in our data-driven world for anyone who hopes to<br />

succeed in the workplace, to achieve academically, to be an engaged<br />

citizen, and to make thoughtful, well-supported decisions in any domain of life that admits of the relevance of<br />

quantitative information. 7<br />

Why Measure <strong>Critical</strong> <strong>Thinking</strong>?<br />

Many believe it is obvious who the best thinkers are in a given agency or institution. But these impressions are often<br />

based on fortunate happenstance, expressions of self-confidence, hierarchical position in the group, and<br />

hindsight. We can no longer afford to be mistaken about who the best thinkers are when error rates are already in question,<br />

when difficult problems and decisions must be addressed, and where poor judgments can lead to irreparable<br />

damage and even cost lives.<br />

At all ages of life, wherever purposeful and reflective judgment is needed, critical thinking skills and mindset (habits<br />

of mind) are essential. Each of us makes judgments that affect ourselves, our families, our country, and our world.<br />

In all of these cases, when the stakes are high, critical thinking is vital. Learning demands strength in critical thinking<br />

because learning requires the interpretation and integration of new knowledge and its practical and appropriate<br />

application when encountering novel situations, problem conditions and innovative opportunities.<br />

4<br />

The Australian Curriculum and Assessment Reporting Authority. The Australian Curriculum v3.0 Numeracy – Organizing Elements.<br />

http://www.australiancurriculum.edu.au/generalcapabilities/numeracy/organising-elements/organising-elements<br />

5<br />

Common Core State Standards Initiative (2011). Common Core Standards for Mathematics. http://www.corestandards.org/wp-content/uploads/Math_Standards.pdf<br />

6<br />

Gittens, C. A. (2015). Assessing numeracy in the upper elementary and middle school years. Numeracy, Vol. 8: Iss. 1, Article DOI:<br />

http://dx.doi.org/10.5038/1936-4660.8.1.3. Available at: http://scholarcommons.usf.edu/numeracy/vol8/iss1/art3<br />

7<br />

Wilkins, J. L. M. (2000). Preparing for the 21st century: The status of quantitative literacy in the United States. School Science and Mathematics, 100(8), 405-418.<br />

Gittens and Wood (2012). Predicting middle schoolers’ future math achievement: The power of an early measure of critical thinking. Manuscript submitted for peer-reviewed publication.<br />

Rivera-Batiz, F. L. (1992). Quantitative literacy and the likelihood of employment among young adults in the United States. Journal of Human Resources, 27(2), 313-328.<br />

Root, R. (2009). Social justice through quantitative literacy: A course connecting numeracy, engaged citizenship and a just society. Democracy & Education, 18(3), 37-43.<br />

Wiest, L. R., Higgins, H. J., & Frost, J. H. (2007). Quantitative literacy for social justice. Equity & Excellence in Education, 40(1), 47-55.<br />

16 CCTST User Manual and Resource Guide © 2016 Insight Assessment / The <strong>California</strong> Academic Press. San Jose, CA. All rights reserved.<br />

<strong>Critical</strong> thinking is one of the key skills valued by employers, according to recent research. In 2013, for example, Hart<br />

Research Associates surveyed CEOs and other C-suite executives at more than 300 private-sector for-profit and nonprofit<br />

organizations.8 Ninety-three percent agreed that “a candidate’s demonstrated capacity to think critically, communicate clearly,<br />

and solve complex problems is more important than their undergraduate major.” In the Robert Wood Johnson<br />

Foundation’s July 2009 Jobs to Careers, Randall Wilson wrote: “Early assessment of critical thinking maximizes<br />

workforce efficiency and increases the potential for learning and educational effectiveness at all levels.” The truth<br />

of this claim is even more apparent today. World culture and an information-intensive everyday life invite us to apply<br />

critical thinking to interpret, analyze, evaluate, explain, and draw warranted inferences about what to believe and<br />

what to do in a stream of novel and too often time-limited or high-stakes, uncertain situations. For the thinking<br />

process to be successful, it must be done with the habits of mind that have been identified as supporting strength<br />

in critical thinking. Studies, some of which are listed in the research section of the Insight Assessment website, have<br />

consistently shown that strength in critical thinking correlates with workplace and academic success, certification<br />

and licensure in the most valued professions, and survival of some of life’s most difficult challenges.<br />

We are learning more about how humans actually engage and try to understand problems and how people make<br />

judgments. Perhaps more importantly, we are learning more about how they make bad judgments, often without<br />

realizing it. When objective measures reveal weaknesses in reasoning skills and habits of mind, there are effective<br />

training and teaching techniques that can be used to strengthen those skills and to foster a more positive disposition<br />

toward thinking and reasoning. An honest and concerned appraisal of critical thinking skills and habits of mind<br />

manifested by working adults and students in all programs of study, together with a focused response to any<br />

demonstrated deficiencies, is the path to growth and achievement for individuals and for society as a whole.<br />

Weakness in critical thinking leads to…<br />

* failure to learn * confused and confounded communication * job loss * lost revenue * patient deaths *<br />

ineffective law enforcement * gullible voters * imprisonment * prejudice * higher combat casualties *<br />

upside-down mortgages * thoughtless criticism * vehicular homicide * heart disease * unplanned pregnancies *<br />

financial mismanagement * family violence * repeated suicide attempts * drug addiction * …<br />

But training critical thinking, and assessing it to confirm gains, will lead to better outcomes…<br />

“I’m glad I thought that through!” vs. “What was I thinking?”<br />

“That was a really good decision!” vs. “That was a terrible decision!”<br />

“We fixed that problem, maybe just in time.” vs. “Why didn’t we<br />

address that when we had the chance?”<br />

8. “IT TAKES MORE THAN A MAJOR: Employer Priorities for College Learning and Student Success,” Hart Research Notes, Hart<br />

Research Associates, 1724 Connecticut Avenue, NW, Washington, DC 20009. 318 employers were surveyed online. Each had 25 or<br />

more employees and indicated that more than 25% of their new hires held an academic degree, associate’s or baccalaureate.<br />


Students and workers with weak critical thinking skills and mindset are not prepared to benefit from the educational<br />

training program that will be offered to them. Their presence in the classroom or laboratory causes instructors to<br />

slow or alter the training of other students and trainees. Their presence in clinics, internships, or field exercises risks<br />

an increase in injuries and liabilities related to likely errors of both inaction and wrong action. Unaddressed<br />

weakness in critical thinking skill results in loss of opportunities, of financial resources, of relationships, and even<br />

loss of life. There is probably no other attribute more worthy of measure than critical thinking.<br />

Human reasoning and problem solving are highly complex processes. Advances in the science of human cognition<br />

enable us to analyze, measure, and improve our human problem solving and decision making. A measure of critical<br />

thinking that describes an individual's comparative strength is a valuable aid in determining a person's capacity to<br />

benefit from training or to succeed in their job and in identifying which skills or habits of mind need attention.<br />

Today, educational programs and workplace training programs are being required to demonstrate that they are<br />

effectively improving critical thinking skills and mindset. Individual measures of core reasoning skills (i.e. analysis,<br />

interpretation, inference, evaluation, explanation, numeracy, inductive reasoning, and deductive reasoning) and<br />

mindset (truth-seeking, open-mindedness, analyticity, systematicity, confidence in reasoning, inquisitiveness, and<br />

maturity of judgment) provide valuable information about potential hires and guidance as to where to direct<br />

improvement programs for workers and students.<br />

The APA Delphi Consensus Definition of <strong>Critical</strong> <strong>Thinking</strong><br />

<strong>Critical</strong> thinking is the process of purposeful, reflective judgment focused on deciding what to believe or what to<br />

do. <strong>Critical</strong> thinking is a pervasive human phenomenon. Many times each day we analyze information, interpret<br />

events and situations, evaluate claims, and evaluate the reasons offered in their support. Based on those analyses,<br />

interpretations, and evaluations we draw inferences and make reflective judgments about what to believe and what<br />

to do. These reflective judgments are the focus of critical thinking.9<br />

Down through the millennia, the great philosophers of cultural traditions throughout the world have described the<br />

constellation of attitudes and attributes most closely associated with the eager desire to engage problems using thinking<br />

skills. More than 80 years ago, in How We Think, John Dewey expressed the significance of these habits of mind.10<br />

The practiced use of critical thinking as an approach to the important problems faced in one’s life and work situations<br />

requires the development of habits of mind that demand excellence in reflective judgment.<br />

In the late 1980s a foundational concept analysis study was conducted to develop a consensus definition of critical<br />

thinking. This study is now referred to as<br />

[Sidebar source: Think <strong>Critical</strong>ly, Facione & Gittens, 2016, Pearson Education, p. 14.]<br />

9. Facione, P. A., “<strong>Critical</strong> <strong>Thinking</strong>: What It Is and Why It Counts.” The latest update, and the Spanish and Chinese translations, are available on the Insight Assessment website.<br />

10. Dewey, J. (1933). "If we were compelled to make a choice between these personal attributes and knowledge about the principles of logical<br />

reasoning together with some degree of technical skill in manipulating special logical processes, we should decide for the former." (p.34)<br />

How We Think: A Restatement of the Relation of Reflective <strong>Thinking</strong> to the Educational Process. Lexington, MA: D.C. Heath.<br />


the APA Delphi Study of critical thinking.11 The strength of the study was the use of the Delphi research method,<br />

which allows the investigator to compile a consensus position without the undue influence of any one participant<br />

scholar or expert. The blinded, iterative conceptual clarification process that occurs with the Delphi approach<br />

permits the discussion of key questions and concepts, and the development of consensus, without introducing bias<br />

related to the prestige of the experts involved. The resulting consensus definition of critical thinking, now world<br />

famous, can be found in a variety of sources, and is also excerpted later in this section.<br />

An important component of the APA Delphi Study was the discussion of the dispositional side of the critical thinker.<br />

One must be disposed to think critically as well as have the skills to do so. The Delphi participants were a mixed<br />

disciplinary group, and among the participants was a cadre of philosophers who were well versed in the writings of<br />

the Greeks. Unlike those who took a cognitive science approach to the project, these philosophers contributed a<br />

definition of critical thinking that centered largely on the attributes of the person who demonstrated critical thinking.<br />

The emergence of the description of the ideal critical thinker and all of the submitted subtext led to the insight that<br />

it would perhaps be incomplete to measure thinking skills while leaving out the personality component.<br />

More than two decades of international research across disciplines affirms the importance of developing a strong<br />

critical thinking disposition. The CCTDI measures these attitudes and attributes, assessing the “willing” side of<br />

“willing and able to think well.” Its companion instruments, like the CCTST, measure the skills.<br />

Notice the clear attitudinal expectations captured in the quote in the text by Peter Facione and Carol Gittens12 when<br />

they speak about the value of strong critical thinking. Of course the strong critical thinker needs to be able to analyze,<br />

infer, and explain with confidence what to believe and what to do in key judgment situations, but without the fair-minded<br />

approach to inquiry that is the heart of critical thinking disposition, the exercise is too likely to fall short of<br />

the expected quality of judgment. In fact, the use of critical thinking may not even occur.<br />

No longer taken for granted as a byproduct of all educational processes, the training of critical thinking skills is<br />

becoming an increasing focus of professional preparation programs. The assurance of excellence in professional<br />

judgment is the result of the sound use of critical thinking skills and the reliable and strong disposition to use those<br />

critical thinking skills. The alternative (acting without adequate analysis of the problem, repeating a previous decision<br />

behavior unreflectively, or continuing to carry out a protocol or process without evaluating its effect) is not an<br />

acceptable quality standard in any professional workplace, nor does it bode well for life decisions in general.<br />

Equally necessary is the training of habits of mind consistent with the dispositional side of strength in thinking. The<br />

CCTDI offers a way to assess the success of these training efforts.<br />

The Importance of Being Willing and Able to Think Well<br />

<strong>Critical</strong> thinkers must be both willing and able to think critically in the course of making decisions. It is possible to<br />

have strong critical thinking skills that are not being applied to decisions and problem solving. Possessing the<br />

requisite cognitive skills is necessary to being a good critical thinker, but so is being disposed to value and use those<br />

skills. The CCTST (<strong>California</strong> <strong>Critical</strong> <strong>Thinking</strong> <strong>Skills</strong> <strong>Test</strong>) is often administered in combination with the <strong>California</strong><br />

<strong>Critical</strong> <strong>Thinking</strong> Disposition Inventory (CCTDI) to assess both the critical thinking skills and the habits of mind in<br />

11. The complete account of the American Philosophical Association (APA) Delphi Study and the derived consensus definition of critical<br />

thinking is reported in the document entitled, “<strong>Critical</strong> <strong>Thinking</strong>: A statement of Expert Consensus for Purposes of Educational<br />

Assessment and Instruction” first published as ERIC Doc. NO.: ED 315 423, 1990. It is republished in a variety of formats and available<br />

as a publication of Insight Assessment / The <strong>California</strong> Academic Press. The executive summary of this document is available as a<br />

free download on the website: www.insightassessment.com .<br />

12. Facione, P. A., & Gittens, C. A. THINK <strong>Critical</strong>ly. Pearson Education, Boston, MA, USA, 2011, 2013, 2016.<br />


students and practicing professionals. The MDCTI, LSRP, TRAA, and the INSIGHT Series of instruments all include a<br />

mindset part as well as a skills part.<br />

Before the CCTDI was available to measure thinking habits of mind, many assumed that strength in the disposition<br />

toward critical thinking would be strongly correlated with strength in critical thinking skills. Not true. Certainly many<br />

individuals are both disposed (willing) to address problems using critical thinking and skilled (able) to do so. Some<br />

individuals demonstrate skills, and yet are unwilling to use those skills to engage problems unless heavily prompted<br />

to do so. Others may be eager to address problems using thinking, but yet not possess the critical thinking skills to<br />

do so. And, of course, there are those who are neither willing<br />

nor able to think critically.<br />

The <strong>California</strong> <strong>Critical</strong> <strong>Thinking</strong> Disposition Inventory (CCTDI) is the premier critical thinking disposition<br />

instrument in the world today. Based on the Delphi Expert Consensus Definition of <strong>Critical</strong> <strong>Thinking</strong>13, 14 and its<br />

description of the ideal critical thinker, this instrument has been used in decades of empirical and conceptual studies<br />

of human reasoning behavior.15 The importance of the dispositional side of critical thinking was described by the<br />

Delphi experts in these terms:<br />

Cognitive skills are like physical skills -- through education, training, and exercise an individual can gain ever greater<br />

proficiency in their use. But the opportunities individuals have had during their lives to train, apply, and refine those<br />

skills may differ vastly. Opportunities to learn often go unused in those who do not have the habits of mind associated<br />

with self-motivated learning. Many people learn only what they think they need to learn to achieve their goals. Their<br />

judgment may be wrong about what is needed.<br />

Engaging problems and making decisions using critical thinking involves both skills and habits of mind. A strong<br />

critical thinker is one who is both disposed to think critically and has the skills to do so.<br />

13. The American Philosophical Association. (1990). <strong>Critical</strong> <strong>Thinking</strong>: A Statement of Expert Consensus for Purposes of Educational<br />

Assessment and Instruction, ("The Delphi Report"). ERIC Doc. No. ED 315-423, pp. 80. [Executive summary including tables and<br />

recommendations (pp. 22) also available through Insight Assessment]<br />

14. Appendices 1 and 2 include a full discussion of the research to define the construct of critical thinking skills and disposition measured<br />

by the CCTST family of test Instruments.<br />

15. Scholars: Consult Section 5 of this document for a subset of these independent research studies. Also consult dissertation abstracts<br />

for completed dissertation studies.<br />


Just as skills assessments measure core cognitive skills (abilities), disposition instruments measure attributes. These<br />

could also be called traits or mindset, and they can be used to describe a person in terms of their inclination to use<br />

critical thinking, in contrast to other strategies, when faced with problems to solve, ideas to evaluate, or decisions<br />

to make. Research indicates that the disposition toward critical thinking can be understood in terms of positive habits<br />

of mind. A person or group strongly disposed toward critical thinking is habitually truth-seeking, open-minded,<br />

analytical, systematic, inquisitive, confident in reasoning, and judicious.<br />

Characteristics of Strong <strong>Critical</strong> Thinkers<br />

From the APA Delphi Report<br />

‣ inquisitive with regard to a wide range of issues<br />

‣ concerned to become and remain well-informed<br />

‣ alert to opportunities to use critical thinking<br />

‣ trusting in the processes of reasoned inquiry<br />

‣ self-confident in their reasoning skills<br />

‣ open-minded regarding divergent world views<br />

‣ flexible when considering alternatives and opinions<br />

‣ understanding of the opinions of other people<br />

‣ fair-minded when appraising reasoning<br />

‣ honest in facing biases, prejudices, stereotypes, or egocentric tendencies<br />

‣ prudent in suspending, making, or altering judgments<br />

‣ willing to reconsider and to revise views where honest reflection suggests that change is<br />

warranted<br />

As indicated above, the APA Delphi panel of international experts defined “critical thinking” for purposes of training<br />

and measurement by emphasizing the reflective thinking process: “<strong>Critical</strong> thinking is the process of purposeful,<br />

self-regulatory judgment. This process gives reasoned consideration to evidence, context, conceptualizations,<br />

methods, and criteria.” This powerful two-sentence definition is the heart of the American Philosophical Association<br />

Delphi Consensus. A very detailed and comprehensive definition of the skills and habits of mind associated with<br />

strength in critical thinking emerged from this multiyear study and was published in 1990 (ERIC Doc. No. ED 315 423, 1990).<br />

To this we would add, “<strong>Critical</strong> thinking is using this process of purposeful, reflective judgment to decide in a<br />

thoughtful, truth-seeking, and fair-minded way what to believe or what to do.” In the absence of critical thinking,<br />

one might simply follow the demands of authority, act without a full awareness of the situation, thoughtlessly do<br />

what has been done before, or do nothing when action is needed.<br />

The impetus for the Delphi study was an increasing tendency to use the term critical thinking to refer to any type of<br />

thinking associated with a positive outcome. Experts in the field were aware that building strength in critical thinking<br />

was not an automatic result of every educational offering. The APA Delphi study facilitated the discussion of experts<br />

from across the disciplines regarding the meaning of the term critical thinking and the thinking skills associated with<br />

this term. The study’s lead researcher (P. Facione), using the Delphi method developed by the RAND Corporation,<br />

obtained objective input regarding the definition of critical thinking from scholars across the disciplines who were<br />

blinded to the source of the input. Unexpectedly, the resulting consensus included both a description of the relevant<br />

thinking skills and a description of the mental disposition of someone regarded as having strength in critical thinking.<br />


This work subsequently provided a framework for a national discussion of the meaning and importance of critical<br />

thinking among employers, educators, and policymakers. In this second, federally funded study spearheaded by Penn<br />

State University, the national sample of employers, educators, and policymakers endorsed both the description of<br />

critical thinking skills and the description of the ideal critical thinker (disposition) as what was needed in US workers,<br />

students, and leaders. Often referred to as the APA Delphi Consensus Definition of critical thinking, this consensus<br />

definition document has proven to be meaningful in educational institutions, government agencies, and business<br />

organizations around the world.<br />

Today, for some, the term is closely associated with informal logic; for others, it is an alternative way to describe<br />

scientific reasoning or rhetorical analysis; and for yet others, it is a synonym for clinical reasoning or professional<br />

judgment. In all of these varying cases and differing disciplines, critical thinking is the major component of problem<br />

definition and reflective judgment processes across all contexts.16<br />

Many times each day, we analyze information, interpret events and situations, and evaluate claims and the reasons<br />

offered in their support. Based on those analyses, interpretations, and evaluations, we draw inferences and make<br />

reflective judgments about what to believe and what to do. These reflective judgments are the focus of critical<br />

thinking. The <strong>California</strong> <strong>Critical</strong> <strong>Thinking</strong> <strong>Skills</strong> <strong>Test</strong> measures these critical thinking skills, assessing the test taker’s<br />

strength in making reflective, reasoned judgments.<br />

The empirical analyses involved in the development of the companion measure of the critical thinking mindset, the<br />

<strong>California</strong> <strong>Critical</strong> <strong>Thinking</strong> Disposition Inventory (CCTDI), effectively reduced the APA Delphi Study’s discursive<br />

description of the ideal critical thinker from nineteen independent descriptive phrases endorsed by consensus to<br />

seven dispositional constructs measurable by scales. In an important respect, the CCTDI refines and extends the<br />

conceptualization of critical thinking expressed in The Delphi Report. These constructs have since been endorsed as<br />

a description of the attributes of a critical thinker by educators and a wide range of working professionals in the<br />

United States and in more than 40 countries around the world.<br />

16. This section is a relatively short answer to the question: “What is meant by the term critical thinking?” If a more complete discussion is desired, refer<br />

to the Resources Section of this user manual.<br />


The APA Delphi Description of the Ideal <strong>Critical</strong> Thinker<br />

“The ideal critical thinker is habitually inquisitive, well-informed, honest in facing personal biases, prudent in making<br />

judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant<br />

information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as<br />

precise as the subject and the circumstances of inquiry permit.”<br />

“<strong>Critical</strong> <strong>Thinking</strong>: What It Is and Why It Counts,” by Peter Facione, is an essay written for students, trainees,<br />

teachers, staff development educators, and the general public. This easy-to-read essay communicates the<br />

importance of critical thinking in all aspects of life. It has been translated into Chinese and Spanish, and it is<br />

updated periodically to include new research on human reasoning. Many publications have included this essay, and<br />

it is linked on many websites. A free download of the most recent version, or of the Spanish or Chinese<br />

translation, for purposes of education and educator training, can be found on our website.<br />


Section 2:<br />

Administration Options<br />

This section describes the wide selection of comprehensive and customizable options available to you for your data<br />

collection. Whether you are planning to administer one or more IA assessment instruments through a browser, LMS,<br />

mobile device, or in paper-and-pencil format, our staff will explain how to use the design options to tailor data<br />

collection to your project.<br />

Purposes and Projects<br />

Assessing Individuals<br />

Individual test taker scores provide key information for advising and mentoring. Whether you are hiring a new<br />

employee or admitting a new student, an assessment of the personal strengths of the candidate helps to direct<br />

training resources and improves the likely success of the training program.<br />

Admissions – Example: When programs are in high demand and student retention is a factor in assuring that<br />

institutional goals are met, adding a measure of critical thinking to the admissions profile assists with identifying<br />

candidates who have the skills and the mindset to learn, to persist, and to succeed.<br />

Advising – Example: Many colleges dedicate resources to create and maintain teaching and learning<br />

centers to help all admitted students to succeed. Along with writing skills, reading skills and language<br />

comprehension, critical thinking is one of the central competencies that must be assessed to help advisors<br />

direct students in program and course selection.<br />

Competency or Individual Proficiency <strong>Test</strong>ing – Example: Training resources are scarce and often<br />

must be effectively directed only to those who require additional support. A threshold can be set to<br />


highlight acceptable strength in critical thinking for your program or industry. Individuals who fail to achieve<br />

that threshold will benefit most from attending training programs aimed at growing critical thinking skills.<br />
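The thresholding described above reduces to a simple screen. The following sketch is a hypothetical illustration only: the score values and the cutoff are invented placeholders, not official CCTST scores or recommended cut points.

```python
# Hypothetical sketch: flag test takers whose overall score falls below a
# locally chosen proficiency threshold, so training can be directed to them.
# The scores and the threshold below are invented, not CCTST norms.

def below_threshold(scores, threshold):
    """Return the test takers whose overall score is under the threshold."""
    return [name for name, score in scores.items() if score < threshold]

cohort = {"Taker A": 21, "Taker B": 15, "Taker C": 18}
print(below_threshold(cohort, threshold=17))  # -> ['Taker B']
```

Individuals on the returned list would be the ones directed to the critical thinking training programs described above.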

Intern and Student Placement – Example: Are your trainees ready for their educational experience?<br />

Research on effective teaching and learning and studies of successful transition or reentry to the workplace<br />

have demonstrated that learning experiences must be designed to challenge but not discourage the learner.<br />

If the training is too challenging, some candidates will hide their failures until errors become apparent.<br />

Scaffolding of new knowledge delivery and well-designed practice internships are proven methods for a<br />

successful professional transition if the candidate has the requisite critical thinking skills to perform in the<br />

work environment. A standard can be set as a criterion for the demonstration of readiness to enter<br />

internship or workplace environments.<br />

Professional Development – Example: Every student, trainee, or working professional can improve<br />

their critical thinking skills and must work on this goal. A measure of overall critical thinking skills,<br />

benchmarked against a national comparison group, helps the individual to make a realistic assessment of<br />

their strengths and weaknesses in critical thinking. Individual skills assessment informs them about what<br />

types of thinking exercises will be most beneficial to them in particular.<br />

Hiring – Example: Orienting new employees to work responsibilities is a necessity, but doing so for an<br />

employee who proves unable to perform the work in the long run is costly when resources are limited.<br />

Employees who have weak critical thinking skills are often the source of serious liability costs as well. Adding<br />

a measure of critical thinking skills to employee hiring procedures can provide the assurance that the new<br />

member of your working team will have the ability to interpret current practices and policies, accurately<br />

apply protocols and evaluate their effectiveness within their scope of work, draw warranted inferences<br />

about potential problem situations, and provide input toward quality improvements.<br />

Assessing Groups<br />

Training critical thinking skills begins with an accurate baseline assessment of group strengths and weaknesses and<br />

continues with an assessment demonstrating the outcomes and accomplishments resulting from the current training<br />

program, and perhaps a reassessment after a refinement of the training program curriculum or emphasis. Early<br />

assessments of groups provide collective diagnostics of their overall strengths and weaknesses and assist an<br />

educator or trainer to focus training efforts toward addressing gaps in the group overall.<br />

New Cohort Assessment – Example: <strong>Test</strong> incoming cohorts to learn about overall strength in critical thinking.<br />

Compare the group to national comparison percentiles for students and trainees in similar educational institutions<br />

or training programs. Examine average scores for individual critical thinking skill areas (analysis, inference, evaluation,<br />

inductive and deductive reasoning) to better understand the strengths of the group as a whole and to determine<br />

where to place emphasis in program objectives aimed at improving critical thinking skills.<br />

Outcomes Assessment – Example: Track groups over time in relationship to a focused effort to<br />

improve an educational program. Compare entry scores, perhaps gathered as a new cohort assessment,<br />

with exit scores to determine how well students have improved overall. Compare the national norm<br />

percentile of students and trainees entering your programs with those exiting the programs. Follow the<br />


growth in overall mean scores and the proportion of test takers in each recommended performance<br />

assessment level (critical thinking skills are strong, moderate or weak/not manifested) at the completion of<br />

the training program.<br />
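The entry-vs-exit comparison described above can be sketched in Python. This is a hypothetical illustration: the scores, score bands, and cut points below are invented for the example, not the CCTST’s published performance-level thresholds.<br />

```python
# Hypothetical sketch of an outcomes assessment: compare entry and exit
# cohort means and the proportion of test takers in each performance band.
# Bands and cut points are illustrative only, not published CCTST norms.

def summarize(scores, bands):
    """Return the group mean and the proportion of scores in each band."""
    mean = sum(scores) / len(scores)
    proportions = {
        name: sum(1 for s in scores if low <= s <= high) / len(scores)
        for name, (low, high) in bands.items()
    }
    return mean, proportions

# Illustrative cut points only.
BANDS = {"weak/not manifested": (0, 12), "moderate": (13, 18), "strong": (19, 34)}

entry_mean, entry_props = summarize([10, 14, 15, 11, 16, 13], BANDS)
exit_mean, exit_props = summarize([15, 18, 20, 14, 21, 17], BANDS)
```

A rising mean and a shrinking “weak” proportion between entry and exit would indicate overall improvement.<br />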

Demonstrating the Quality of an Educational or Training Program – Example: Use a new cohort assessment in<br />

conjunction with an outcomes assessment measure to determine the magnitude of the training program’s effect<br />

on building critical thinking skills. This type of assessment is needed when agencies depend on the demonstration<br />

of quality to maintain funding or support for their programs. Other agencies seek to demonstrate quality in the<br />

training of critical thinking as a measure of their value to the community and to society at large.<br />

Demonstrating Group Proficiency – Example: Responding to accreditation guidelines, an<br />

educational institution compares the mean score for a representative group of graduating students against<br />

the national percentiles, having determined that proficiency for its students will be demonstrated if their<br />

mean score is at or above the 50th percentile.<br />
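A proficiency check of this kind can be sketched as follows. The norm table here is invented for illustration; real national percentile tables come from the test publisher.<br />

```python
# Hypothetical sketch: check whether a graduating cohort's mean score
# reaches the 50th national percentile. The score-to-percentile table is
# invented for illustration, not a published norm table.

NATIONAL_PERCENTILES = {13: 30, 15: 40, 17: 50, 19: 60, 21: 70}

def percentile_for(mean_score):
    """Map a mean score to the highest norm-table percentile it reaches."""
    reached = [p for s, p in NATIONAL_PERCENTILES.items() if mean_score >= s]
    return max(reached) if reached else 0

cohort_scores = [16, 18, 17, 19, 15]
cohort_mean = sum(cohort_scores) / len(cohort_scores)
proficient = percentile_for(cohort_mean) >= 50
```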

Staff Development – Example: Measure the strength of various employee groups to determine<br />

their relative skill in problem identification and problem solving. Determine the overall critical thinking<br />

strength of workplace teams and assess whether this demonstrated strength in critical thinking is adequate<br />

to workplace demands.<br />

Preliminary Considerations<br />

Choose the Right <strong>Test</strong><br />

Versions of the CCTST family of instruments: When thinking about measuring critical thinking, the first<br />

decision is whether you want to test for strength in thinking skills or measure the mindset (dispositions) that<br />

motivates a person to apply his or her skills. This is the test manual for the CCTST, a test of critical thinking skills. The<br />

next decision is which form of the test fits your test taker: This choice depends on the age and educational level of<br />

the test taker. This test manual accompanies your purchase of the CCTST designed for adult test takers of all ages. If<br />

you are testing children or using one of the specialized forms of the CCTST that is designed for Business, Health<br />

Sciences, Law, Community and Technical Colleges, Secondary Schools, Military Studies, or other specialty forms of<br />

the CCTST, contact Insight Assessment regarding a replacement test manual designed to accompany those forms of<br />

the test. Employers seeking to evaluate the core reasoning skills and related attributes of job applicants and current<br />

employees from the support staff to the executive levels are advised to consider the INSIGHT series.<br />

Accurate measurement of critical thinking skills requires that the test be calibrated to fit the likely skill range of the<br />

planned test taker group. This version of the CCTST is designed to measure critical thinking skills in adults who are<br />

attending colleges and universities for either undergraduate or Masters level educational programs. Other forms of<br />

the CCTST are designed and recommended for individuals who are in elementary through secondary (in the USA<br />

these are referred to as K-12 programs), or for those enrolled in doctoral level programs. There is also a<br />

recommended form of the CCTST designed for community and technical colleges.<br />

Reading Level Considerations: To perform well on a critical thinking test, the test taker must be able to read<br />

the question scenario and answer choices and understand the question being asked. This process itself involves<br />

the critical thinking skills of interpretation and analysis to a great extent. However, it is important that reading<br />

and language issues are not significant barriers for the test taker. With the exception of the CCT-G835, all <strong>California</strong><br />

adult level critical thinking skills tests are set at a Flesch-Kincaid reading grade level of 8.6 or lower. K-12 versions of<br />

the tests have Flesch-Kincaid reading levels well below the grade level of the intended test taker.<br />
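The Flesch-Kincaid grade level mentioned above is computed from average sentence length and average syllables per word. The sketch below uses the standard published formula with a crude vowel-run syllable counter, so its estimates are approximate; real readability tools use better syllable estimation.<br />

```python
# Flesch-Kincaid grade level:
#   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
# Syllables are approximated here as runs of vowels, which is rough.
import re

def count_syllables(word):
    """Very rough syllable count: runs of vowels, minimum one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Return the approximate Flesch-Kincaid reading grade level of text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59
```

Short, monosyllabic sentences score below grade 1 (the formula can go negative), while long sentences with polysyllabic words push the grade level up.<br />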

Language Comprehension: Language is also a consideration when assigning a critical thinking test. Students<br />

or workers taking the test in a language other than their native one may have difficulty demonstrating their true critical<br />

thinking skill on the CCTST if their language comprehension is inadequate to interpret the question scenario and<br />

answer choices and understand the question being asked. There are many authorized translations of the CCTST<br />

available. Visit the “Translations” tab for a specific testing instrument to see its current list of authorized translations.<br />

Collect the Most Informative Data<br />

Sampling Decisions: Perhaps you are interested in testing everyone in your program or organization. In that<br />

case, discussions about who to test first may not be highly relevant. In most situations, however, there is some<br />

benefit to deciding where to begin with an assessment of critical thinking skills. Here are a variety of considerations<br />

about how to identify a first group of test takers for your assessment program.<br />

Examples of Sampling Design<br />

Admissions: Adding a measure of critical thinking to the information being gathered and considered for<br />

program admission is much like the design for hiring. Each program has a particular selectivity. Programs<br />

that have limited capacity tend to have a higher selectivity and would likely set the threshold for scores at<br />

a higher level than those that serve greater numbers of persons and are more interested in minimal<br />

thresholds of readiness for program participation. The recommended performance assessments for the<br />

OVERALL Scores are helpful to consider in determining a threshold score for your program admission<br />

purposes.<br />

Program Evaluation: The effectiveness of a training program is usually assessed by comparing scores of<br />

those entering the program with scores of the same students or employees when they exit the program. If<br />

entering cohorts are similar in most or all cases, it may be adequate to collect entry data only once or twice<br />

to establish that this is the case, and then to move to an exit only testing design.<br />

Demonstrating Outcomes that Meet an External Criterion: If the requirement of the program is that<br />

everyone, or a specified proportion of enrollees, should achieve at least a particular level of critical thinking<br />

proficiency, and there is no concern to provide individual feedback or training to any given individual,<br />

testing at the end of the program is an appropriate design.<br />

Hiring: Two groups of test takers are often relevant: the applicant pool you are considering for an interview<br />

and strong employees who hold similar jobs in your organization. The range and average of the test scores<br />

of your current employees will help you set a threshold for scores you would prefer to have in applicants<br />

coming for interview.<br />
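The hiring design above can be sketched as follows. The employee scores and the specific threshold rule (the lower of the group mean and minimum) are invented for illustration, not a recommendation from the manual.<br />

```python
# Hypothetical sketch: derive an applicant interview threshold from the
# score range and average of current strong employees in similar jobs.
# Scores and the threshold policy are illustrative only.

employee_scores = [18, 21, 19, 22, 20]
mean_score = sum(employee_scores) / len(employee_scores)
low, high = min(employee_scores), max(employee_scores)

# One illustrative policy: invite applicants at or above the lower of the
# employee mean and the employee minimum.
interview_threshold = min(mean_score, low)

def invite(applicant_score):
    """Invite applicants scoring at or above the threshold."""
    return applicant_score >= interview_threshold
```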

Selecting an Administration Method: Insight Assessment instruments are offered digitally online<br />

and in paper-and-pencil format. Insight Assessment staff is available to consult with you regarding which of these<br />

administration methods is most appropriate or whether both may be needed in your setting.<br />

Proctoring <strong>Test</strong>ing: Insight Assessment’s online testing is protected by high-level security access and data<br />

encryption. The testing system permits maximum flexibility for scheduling test assignments, allowing them to be<br />

scheduled at whatever time is optimal. Proctored testing is the standard for assuring that tests are taken in an<br />

appropriate testing environment by the individual who is assigned to take the test. Whenever test results will be<br />

used to inform decision-making about the candidate / test taker, the testing session should be done in monitored<br />

testing centers.<br />

Motivate People to Give Their Best Effort<br />

Conditions of testing can affect the testing experience, test taker effort, and the quality of<br />

the assessment results. Informing test takers of why they have been assigned a testing session and of the importance<br />

of providing their best effort on the assessment is often an important consideration.<br />

Many people sustain a stronger effort on the assessment when the result is of interest to them or carries some<br />

significance to them. Informing test takers of the possibility of obtaining their assessment result may be an important<br />

consideration for your testing plan. Job applicants are rarely informed of their assessment performance. Depending<br />

on the reason for testing and the details of the educational program, employees<br />

and students are typically informed of their results at appropriate times in their<br />

educational programs (during staff development or advising sessions when testing<br />

is done as a diagnostic, or at the completion of the programs when testing is a<br />

component of program evaluation or educational outcomes assessment). Even<br />

when individual results are not offered, it is often useful to communicate the value<br />

of the testing by providing the test taker group with information about the group<br />

as a whole and how the testing scores will be used.<br />

Most educators believe that learning about personal performance is a motivating<br />

factor for future effort at self-improvement. Scores on the Insight Assessment<br />

instruments are proven indicators of success in workplace transition and<br />

professional certification and licensure. [Sections 4 and 5 of this manual discuss<br />

predictive validity and reliability and provide hyperlinks to relevant publications<br />

and research reports.]<br />

Depending on the design and the objectives of your testing program, it may be useful to provide people with reports<br />

of their personal performance. Individual score reports can be easily provided to each test taker when clients use<br />

electronic testing.<br />

The decision of whether to provide assessment results to the test taker is made by the test administrator. This<br />

decision may affect the selection of test administration options. Electronic testing options enable the client to select<br />

whether or not to provide each test taker with his or her test results for viewing, printing, or emailing. In the case<br />

of paper-and-pencil testing, providing results to individual test takers must be managed by the test administrator,<br />

once CapScore assessment results have been returned to the test administrator by Insight Assessment.<br />

Consider Your <strong>Test</strong> Administration Option(s)<br />

Insight Assessment instruments are administered in a variety of ways: through the institution’s learning<br />

management system (LMS), online through an Internet browser, or in paper-and-pencil format. (Check our website or contact<br />

us about the availability of our tablet / mobile device app option.)<br />

Insight Assessment instruments are timed. The time allowed is more than sufficient for a valid test administration in<br />

the absence of specified disability and when the selected test is appropriately matched to the test takers’ age and<br />

educational level. All of the Insight Assessment electronic test administration options include timers on screen to<br />

aid the test taker. An electronic testing session automatically submits a test for scoring at the end of the prescribed<br />

testing period, if the test has not already been submitted for scoring. Paper-and-pencil administration requires a<br />

proctor to monitor the timing of the testing session.<br />

Proctored environments are optimal in educational settings for high stakes testing whether the client has selected<br />

to test electronically or using paper-and-pencil. See Section 6 of this manual for details about the client’s<br />

responsibilities with regard to instrument protection and security.<br />

Contact us by phone (650-697-5628) or through our website to talk with one of our experienced assessment<br />

specialists about your particular test administration needs.<br />

Learning Management System (LMS)<br />

Administration<br />

Many Insight Assessment customers benefit from Learning Management Systems (LMS) that integrate into their<br />

business or institutional processes. These can provide tight security and control over hiring, training, and<br />

development processes. In addition to our standard testing services, Insight Assessment products are now capable<br />

of working seamlessly within your LMS and security systems. Our high-quality, secure, encrypted solutions make<br />

critical thinking and disposition testing easy to integrate into all key quality improvement processes of your<br />

company or institution.<br />

Insight Assessment testing products can be delivered through Blackboard, Moodle or many other learning<br />

management systems in use at your company or educational institution. Because these LMS products vary, and your<br />

company installation will differ, we work directly with your in-house technology representative during set-up to<br />

ensure a smooth solution.<br />

Online Administration<br />

Getting Started<br />

In a hurry? Your online account can often be established within one<br />

business day. Our customer support and technology staff will work<br />

with you to meet your needs.<br />

Previewing the Online System: If you are not familiar with the assessment instrument or the use of our online<br />

system, obtaining a preview pack will help you see how our assessment instruments can best be used at your<br />

organization or institution. This user guide and technical manual accompanies a preview of the online system to<br />

provide in-depth information as needed. The preview is designed to help you to receive optimal value from your<br />

assessment adoption and to view the assessment and the assessment experience from the perspective of the<br />

individual being assessed. Each preview includes one or more opportunities to see the personal profile page, view<br />

an example of the assessment or assessments relevant to your planned project, see how the individual being<br />

assessed responds to and then uploads their responses, and (at the client’s option) immediately view a printable<br />

report of their scores.<br />

Full Service Online Assessment: Businesses and personal advising and counseling customers often require<br />

continuous assessment capability with real-time results delivery. Fitting the assessment to the individual test taker<br />

provides the best measure of critical thinking ability. For some, this means having a variety of instrument versions<br />

available to fit educational level, spoken language, or specialty practice. When it makes sense to have these<br />

conditions handled as part of the purchase agreement, Insight Assessment staff can provide you with customized<br />

support service to fit your assessment plan.<br />

Hands-On Online Administration: Some customers prefer to schedule their own assessments and download group<br />

data themselves. Once you have become a client and have had an orientation to the online system, you will be able<br />

to log into the system as a test administrator to assign and schedule the assessments available to you through Insight<br />

Assessment, and to manage and download your assessment data. Each individual you assess is assigned a unique ID<br />

within the Insight Assessment testing system. This permits you to track their scores over time in the case of multiple<br />

test administrations. An array of assessment instruments can be added to a client’s online account as described<br />

below. You or a designated person at your institution or business can be provided with an online account that allows<br />

you to control all needed aspects of your online testing. The hands-on option is designed for customers who enjoy<br />

the ability to control the timing of assessment assignments and select varying options for each individual or group<br />

each time they begin a new assessment session.<br />

Personalized Telephone Orientation *: To get started with hands-on administration, adoption of online testing<br />

includes a telephone orientation to your online test administrator account. During that telephone orientation, our<br />

online technology support professionals will assist you to create your <strong>Test</strong> Administrator account. This account,<br />

secured with your unique Login and Password entry to the testing system, will provide you with continuous access<br />

to the online system capabilities. The discussion below is a brief summary of some of the information provided in this<br />

telephone orientation session.<br />

Login as a <strong>Test</strong> Administrator: Clients access their personal <strong>Test</strong> Administrator interface using the client login<br />

button located on the Insight Assessment home page: www.insightassessment.com.<br />

During your orientation, you will be guided through your selection of the available online system options and making<br />

your first assessment assignment. You’ll find that the system is easy to manage and offers you the opportunity to<br />

set the options in your account to match the situation at your institution or business.<br />

As a <strong>Test</strong> Administrator you will be able to login to the Insight Assessment online system anytime, anywhere, via<br />

the Internet using your secure Login and Password.<br />

The process is quick and easy. You will learn how to:<br />

• set up assessments to be taken online during the time windows you specify<br />

• choose assessments in the language of the individuals you plan to assess<br />

• check in to see whether assessments have been completed<br />

• set options to show (or hide) assessment results to the individuals being assessed<br />

• add (or remove, with staff support) demographic questions<br />

• create a report of the scores for a group of individuals who have completed an assessment<br />

• sort data by demographic variable, assignment, or group<br />

• download an individual assessment report<br />

• download a spreadsheet of your group’s assessment scores<br />

• and more, as best fits your project<br />

* If you prefer to have an account with no need for you to personally make assessment assignments or request<br />

data downloads, talk to our technical staff about customized service.<br />

Checklist for Hands-On Online Assessment Administration<br />

Step 1) Assign the Login(s) and Password(s) for the individuals you plan to assess. There are two main options to<br />

consider in this process. Option One: If you are assessing a small number of individuals (e.g., hiring candidates,<br />

employees, course participants, or trainees) you may prefer to assign each person a unique Login and<br />

Password. Option Two: As an alternative, it may be more convenient for the individuals you plan to assess to enter<br />

the online system using one Login and Password combination (a universal Login and Password portal). If you use<br />

Option Two, the online system will then assign a unique Login and Password when the individual you assess<br />

completes and saves their demographic profile information. A more complete discussion of these options is included<br />

in your system orientation with our staff and in your support materials.<br />

Step 2) Set up the Assessment Assignment: First select the testing instrument to be administered and the time<br />

frame during which you want it to be available for the individuals you plan to assess. If you have forgotten how to<br />

do this, call and work with our staff to correctly set up your assignment or follow the guidelines in the help files<br />

you were provided during the orientation to your <strong>Test</strong> Administrator account.<br />

Step 3) Give the individuals you plan to assess their assignment and Login and Password information. Inform them<br />

of when and where they may take the assignment(s) online. Do NOT permit them to examine, study, copy, review,<br />

or otherwise have access to the assignment other than during their online assessment session.<br />

Instructions for individuals being assessed: There is no need to instruct the individual you plan to assess in<br />

the use of the online system. You need only direct them to the Insight Assessment website where they will find the<br />

“<strong>Test</strong> taker Login” button and the remainder of the process is self-explanatory. However, if you would like to<br />

distribute instructions with the assessment assignment, the following page is a printable instruction sheet for using<br />

the online system. A proctor instruction sheet has also been included in this section in the event that one is required<br />

at your organization.<br />

Remember: Check Computers for Readiness<br />

Check each computer for readiness by entering Login “setup” and Password “etesting” into the dialogue box on<br />

the dark blue screen which appears after clicking the yellow “<strong>Test</strong> taker Login” button on the right-hand side of<br />

our website home page: www.insightassessment.com. After clicking the button, allow a few moments for the<br />

Insight Assessment testing interface to load using Java. If there are any difficulties, run the diagnostic tool by<br />

clicking the yellow “Click Here” link on the login screen, see the PDF of test taker login instructions (with screen<br />

shots) located under the “About Us” tab on our homepage, or contact Insight Assessment for technical assistance.<br />

Instructions for Online System Use<br />

This testing period is timed. The time remaining for completing your assessment is visible on the timer. Please be<br />

sure that you allow yourself plenty of time and that you are completing this assessment in an environment where<br />

you can work without interruption, and if using a laptop, that you have plenty of battery life to complete the<br />

assessment.<br />

1. Open your browser and navigate to our home page: www.insightassessment.com<br />

2. Click the Yellow “<strong>Test</strong> taker Login” Button at the top right of the home page.<br />

3. When the dark blue Login screen appears enter the Login and Password you have been given for your assessment<br />

assignment:<br />

Example: Login: XYZabc2013 Password: Graduate13<br />

Note: If you have any problems with the login, you can check the configuration of your computer by using the yellow<br />

“click here” diagnostic on this login screen.<br />

4. To ensure you do not lose your responses, please review the navigational aids on the “Warning” screen and then<br />

click “Continue.”<br />

5. Give the system a moment to load Java. You will see a Java logo and progress bar on a white screen.<br />

Note: Please follow any instructions that may appear asking to “open” or “run” the Java program. If the testing<br />

system fails to open, please go to: https://members.insightassessment.com/Verify?bhcp=1<br />

6. When your personal profile page opens: Respond to all the items on this screen and then click “Save Profile.” You<br />

can click “Continue” to move to the assessment itself only after your profile is saved.<br />

7. Select the assessment you have been assigned using the pull down menu, click “Continue.”<br />

8. Accept the User Agreement Terms.<br />

9. Read the instructions and complete the assessment.<br />

10. Depending on the screen resolution of your computer, you may need to use the scroll bar to read the questions<br />

and answer choices, or to see the navigational arrows to move from question to question.<br />

11. After completing all the questions, submit your responses by clicking “Done with test/survey” – top left.<br />

12. You can see the time remaining in the timer displayed at the top right of your screen. Your responses will be<br />

submitted for scoring automatically if time expires.<br />

13. Once you’ve completed your assessment, you may Log Out, complete another assessment if one has been<br />

assigned, update your personal profile, or, if the test administrator has authorized it, view and print your<br />

assessment results.<br />

Proctor Instructions: Online Assessment Administration<br />

The Insight Assessment Online <strong>Test</strong>ing System is completely self-directed, but to further assist the individuals you<br />

are assessing in completing their profile and assessment, you may read aloud or distribute the Instructions<br />

for Online System Use.<br />

Assure that Computers are ready for testing: Check each computer for readiness by entering Login “setup”<br />

and Password “etesting” into the dialogue box on the dark blue screen which appears after clicking the yellow “<strong>Test</strong><br />

Taker Login” button on the right hand side of our Website home page: www.insightassessment.com. If there are<br />

any difficulties, run the diagnostic tool by clicking the yellow “click here” diagnostic link on the login screen, or<br />

contact Insight Assessment for technical assistance (650-697-5628).<br />

Beginning Your Assessment Session<br />

1. Direct the individual being assessed to log in from the Insight Assessment home page using the Login and Password<br />

which you provide for them.<br />

2. Provide any special instructions you may have received from the administrator assigning the assessment regarding<br />

completing the profile. Remind those being assessed to complete and then “SAVE” their personal profile. They will<br />

be able to download their assessment(s) only after their personal profile is completed and saved. The profile may<br />

include some demographic questions if these have been selected for inclusion by your organization or institution.<br />

3. You may provide this information (optional):<br />

• questions can be answered in any order and answers can be changed at any time prior to submitting responses<br />

• the time remaining in the testing session will be displayed at the top right of their computer screen<br />

• a reminder to use scroll bars if necessary to view the questions, answer choices, and navigation buttons<br />

• information about whether those being assessed will be able to view or print score reports<br />

• use of scratch paper is encouraged<br />

4. If necessary, signal individuals when they may begin completing their assessment.<br />

During the testing period:<br />

Maintain an environment where those completing an assessment will not be distracted.<br />

It is important that proctors not respond to questions or comments from individuals regarding any of the<br />

assessment items, or to requests to clarify any of the assessment items. Proctors should not predispose test<br />

taker performance by commenting on the assessment or any of its questions.<br />

Paper-and-Pencil Administration<br />

CapScore General Information: Paper and pencil testing with instruments licensed through Insight Assessment uses<br />

the CapScore scoring system. All assessments that are administered using paper and pencil materials are scored at<br />

Insight Assessment through the CapScore scoring system. Reports of the results of administering the assessments<br />

are returned to clients as electronic documents. Each new CapScore testing system client is invited to consult with<br />

our staff to determine how best to use paper-and-pencil testing and the CapScore scoring service at their<br />

institution or business. If paper-and-pencil testing is the best solution for your institution, agency or business, you<br />

will be purchasing all needed assessment materials from Insight Assessment. Rush delivery services are available if<br />

required to meet your planned testing date.<br />

<strong>Test</strong>ing Materials Needed: Each time you administer an assessment in paper-and-pencil format, the individual<br />

you assess should be given a clean, unused booklet and a CapScore response form. Quality control audits have<br />

shown that when assessments are administered in paper-and-pencil format, those being assessed are accustomed<br />

to marking in booklets as they reason through the items, eliminating answer choices that they do not intend to<br />

select. To assure booklets are free of comments left by others, new booklets are always supplied with each<br />

assessment purchase.<br />

CapScore response forms are date-stamped to indicate their use period. Service agreements for scoring expire 12<br />

months after purchase. Rush shipment of new booklets and CapScore response forms is available.<br />

CapScore response forms are coded with the assessment name and form, and care should be taken to assure that<br />

the booklet code matches the CapScore response form code. If you are using a variety of assessment forms at your<br />

institution, you may have a variety of CapScore forms at your agency. Assure that the code numbers on the booklets<br />

match the code numbers on the CapScore response forms. Combining paper-and-pencil assessment materials from<br />

multiple purchases is permissible, so long as the booklets and code numbers match and the assessment materials<br />

remain valid (check the expiration date on the CapScore form).<br />

ID Number Field: Each CapScore response form has a field for the test taker ID<br />

Number. It is important that this number be entered on the form correctly as this is the number that will be used to<br />

deliver score reports for each test taker. To assure that there is no possibility of confusing the results of your test<br />

takers, no two answer sheets should have the same ID number. This ID number might be a student or employee ID<br />

number, or a number that is assigned to the test taker solely for the assessment session. Please do not use social<br />

security numbers for reasons of personal identity security.<br />

The ID numbers that test takers enter SHOULD IDEALLY BE NINE DIGITS LONG so that all of the boxes are filled. This<br />

assists the test taker to observe possible errors in the entry of the ID number. If you wish to use ID numbers shorter<br />

than 9 digits, it is best to use leading zeros at the beginning of the number as place holders. We recommend against<br />

inviting test takers to make up their own ID numbers, as this often leads to unexpected duplications. IMPORTANT:<br />

35 CCTST User Manual and Resource Guide © 2016 Insight Assessment / The California Academic Press. San Jose, CA. All rights reserved.<br />

Test takers must also darken the bubbles that correspond to their ID number. It is the bubbles that are scanned<br />

for scoring, and the bubbles must be darkened well with No. 2 pencils.<br />
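The ID hygiene described above (nine digits, leading zeros, no duplicates) can be checked programmatically before a session. The sketch below is illustrative only; the roster values and function name are made up, not part of the CapScore system.<br />

```python
# Sketch: preparing a roster of test taker ID numbers before a CapScore
# session. Short IDs are zero-padded to the nine digits the form expects,
# and duplicates are rejected. All IDs here are hypothetical.

def prepare_ids(raw_ids, width=9):
    """Zero-pad IDs to `width` digits and fail loudly on duplicates."""
    padded = [str(i).zfill(width) for i in raw_ids]
    dupes = {i for i in padded if padded.count(i) > 1}
    if dupes:
        raise ValueError(f"duplicate ID numbers: {sorted(dupes)}")
    return padded

print(prepare_ids([42, 7001, 123456789]))
# → ['000000042', '000007001', '123456789']
```

Running a check like this against your master list before printing forms catches duplications before they reach the scanner.<br />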

We recommend retaining a master list of each person’s name matched with his or her personal ID number. For you<br />

to match scores to individuals, test takers need to fill in the ID number section correctly. This master list should be<br />

kept in a secure place to protect the privacy of the test takers. This list will be the means by which you will be able<br />

to connect the CapScore results to the individuals who took the assessment.<br />

Group Indicator Field: On each CapScore response form there is a two-character or a three-character field<br />

that can be used to identify subgroups of test takers (a department, an instructor’s section, a course or training<br />

program, a number of years of experience, research group, or any other factor or variable that can be coded into<br />

three digits). The group indicator field permits up to 99 two-digit codes or 999 three-digit codes to separate groups within<br />

your organization or project.<br />

Using the group indicator field means that there is no need to physically separate response forms by group when<br />

returning the batch for scanning and scoring. When scanned as one large batch, the information in the group<br />

indicator field will be all that is needed to enable the scoring system to differentiate your assessment scores by<br />

group. Basic statistics for each of your groups will be provided as a part of your results package in all cases where<br />

the sample size is adequately large (there must be at least 20 test takers in each subgroup).<br />

To use the group indicator field, simply designate a specific number for each group and instruct test takers to enter<br />

this number in the “group” field on the CapScore response form. For example, you might indicate a pretest group<br />

as group 001 and a posttest group as group 002. Or you might designate a different group number for each<br />

department in your organization or for each program in your curriculum, or for each section of a course, or each<br />

position in a set of job postings. See our website for more on how to identify and differentiate subgroups of test<br />

takers within a given batch sent for scoring.<br />
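The pretest/posttest example above amounts to a simple mapping from group names to zero-padded codes. A minimal sketch, with hypothetical group names:<br />

```python
# Sketch: assigning three-digit group indicator codes to subgroups,
# as in the pretest/posttest example. Group names are hypothetical.

groups = ["pretest", "posttest"]

# Map each group to a zero-padded three-digit code: 001, 002, ...
group_codes = {name: f"{n:03d}" for n, name in enumerate(groups, start=1)}

print(group_codes)  # → {'pretest': '001', 'posttest': '002'}
```

Keeping this mapping alongside your master ID list documents which code each subgroup was told to bubble in.<br />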


Instructions for Paper and Pencil Administration<br />

Use CapScore Paper-and-Pencil Response Forms<br />

1) Begin by filling in the CapScore response form with your personal information. (Accommodations may be<br />

made for disabled test takers who require assistance with marking CapScore response forms.)<br />

2) Be sure to write your ID number in the small boxes along the top of the ID number section and then fill in the<br />

corresponding bubbles below each number. Darken in the bubbles completely and correctly.<br />

3) Use only an HB pencil (#2 pencil in the USA) to complete your CapScore response form. Do not<br />

use pen. Do not use markers.<br />

4) (Optional) Indicate your group by darkening the appropriate bubbles. Also indicate your gender, class level,<br />

and how you identify yourself within the provided categories.<br />

5) (Unless directed otherwise) Write your name and today’s date on your response form.<br />

6) Be sure to completely erase any mistakes or stray marks on your CapScore response forms before submitting<br />

your completed assessment.<br />

7) Please be courteous to others completing assessments, and avoid causing any distractions. Please turn off all<br />

electronic devices.<br />

8) This is a multiple-choice assessment. You should select the one best answer for each question. Record your<br />

answers by darkening only one bubble for each item on the CapScore response form.<br />

9) Your testing session is timed. Be sure to consider time as you complete this assessment.<br />

10) The assessment proctor will collect your testing materials when the session is over.<br />

Proctor Instructions: Pencil-and-Paper Administration<br />

Testing environments should allow for focused concentration on the assessment and be well lit and comfortable.<br />

Individuals completing the assessment should be well rested rather than in a state of cognitive fatigue. All electronic<br />

devices should be turned off during a testing session. Adequate time for completion of the assessment should be<br />

assured. Time on assessment may be extended if this is appropriate for test takers with documented disabilities.<br />

Before the Testing Period:<br />

1. Bring these materials to the testing room: a fresh, clean assessment booklet, a CapScore response form,<br />

and an extra HB (USA #2) pencil for each individual completing the assessment.<br />

2. Be aware that CapScore response forms must be marked with pencil only. Forms completed with pens or<br />

markers cannot be scored.<br />

3. Be sure that the assessment name and code number on the cover of the assessment booklets matches the<br />

name and code number on the CapScore response forms. If you have been given assessment booklets<br />

with code numbers that do not match the code numbers on the CapScore response forms, notify your<br />

Test Administrator (the individual who has assigned this assessment) or contact Insight Assessment.<br />


4. The testing session is timed. Follow instructions for time allowed. Some individuals may finish early, but<br />

many will need this entire period of time to complete the assessment.<br />

5. Additional information is included in the Test taker Instructions. You may wish to read or distribute the Test<br />

taker Instructions on the previous page.<br />

During the Testing Period:<br />

It is important that assessment proctors not respond to questions seeking to clarify any of the assessment’s items.<br />

Commenting in any way may predispose test takers toward different responses.<br />

It is also important that proctors do not allow testing materials to be taken from the room. Be sure to give clear<br />

instructions about filling in the test taker ID Number and marking all responses clearly. As booklets and CapScore<br />

response forms are returned, check that test taker ID numbers have been properly entered. When the allowed time<br />

period expires, collect all copies of booklets and all CapScore response forms and verify the count.<br />

After the Assessment Session is Complete:<br />

Store all copies of the booklets and CapScore response forms in a secure area. Do not permit access to the booklets<br />

or CapScore response forms before or after the assessment session. Destroy and recycle the used booklets. Return<br />

CapScore response forms to the appropriate address for scoring as indicated on the website:<br />

www.insightassessment.com.<br />

Important Notification: Paper-and-pencil testing purchases include time-limited licenses<br />

to use test booklets. All licenses are one-time use only. Insight Assessment retains<br />

ownership of all test booklets. The booklets are leased to the client as part of the client’s<br />

purchase of test use licenses. Each license includes the right to use the booklet one time<br />

and also the scoring and score reporting for that use. The client is directed to destroy the<br />

booklets after they have been used once. Purchase of testing is an affirmation that the<br />

client understands and will comply with the licensing agreement and protect the security<br />

of the testing instrument and all of its questions.<br />

Scoring Information - CapScore<br />

Returning CapScore Response Forms to Insight Assessment for Scanning, Scoring and Results<br />

Reporting: Your purchase of paper-and-pencil testing includes CapScore scanning, scoring, and descriptive<br />

statistical analysis. Paper-and-pencil testing requires that completed CapScore response forms be returned to the<br />

company for scoring. Only original CapScore response forms can be processed to report your results. Scored results<br />

are returned using electronic file transfer. Insight Assessment provides scored data files in PDF and Excel® format.<br />

Assessment results are reported to clients by email within 20 working days of receipt of your CapScore response<br />

forms by Insight Assessment. Rush processing is available if you wish to receive results electronically in 3 working<br />

days; additional charges apply.<br />

Please note that CapScore response forms are specifically designed and encoded for each Insight Assessment<br />

instrument. Only Insight Assessment printed and encoded CapScore response forms can be accurately scanned.<br />

Please protect purchased CapScore response forms from damage and return them free of stickers, clips, staples or<br />

other attachments. Damaged CapScore response forms or photocopies of CapScore response forms cannot be<br />

accurately scanned. Receipt of damaged or photocopied CapScore response forms from any customer will result<br />

in additional fees and likely delays in delivery of score reports.<br />


Checklist for Preparing CapScore Response Form:<br />

To be sure scores can be reported accurately, check CapScore response forms to assure the nine digit ID<br />

number section has been bubbled in correctly. Make sure that the written ID numbers match the marks in<br />

the bubbles below.<br />

Send your original CapScore response forms back to Insight Assessment for scoring. Before mailing, make<br />

copies of the response forms so that your data are preserved if the originals are lost in transit. Note: these<br />

copies cannot be scanned, but they will contain your data in the event that your response forms are lost by the<br />

mail services; other scoring arrangements can be made in that event.<br />

We suggest sending the CapScore response forms via registered mail or another secure courier.<br />

Complete and include the CapScore Return Form when sending your CapScore response forms for<br />

scoring. This form identifies your data and provides a return address for your assessment scores. You can<br />

download a copy of this form from the Insight Assessment website: www.insightassessment.com.<br />

Mail your original CapScore response forms to the address indicated on the website:<br />

As of the writing of this manual, the address is:<br />

Insight Assessment<br />

Attn: CapScore<br />

1735 N First Street, Suite 306,<br />

San Jose, CA 95112-4511, USA.<br />

Only Insight Assessment printed and encoded CapScore response forms can be accurately scanned.<br />

Photocopied forms will not be scored.<br />

Use the Group Indicator Field on the CapScore response forms to separate test takers into groups for<br />

scoring. (See “Group Indicator Field” above.) Note: It is not necessary to physically separate groups using<br />

paper clips, rubber bands, stickers, etc. To do so may damage the forms and prevent them from being<br />

accurately scored. The Group Indicator Field is intended to identify separate groups within the same batch<br />

of test takers. But, if different batches of CapScore response forms should be scored separately and results<br />

returned separately, send separate CapScore Return Forms with each batch indicating the details of your<br />

request. Separate batch scoring may incur additional fees for additional scoring passes.<br />

Your original CapScore response forms are not returned automatically. If necessary, these forms can be<br />

returned to you upon request, provided the request is received within 12 months of scoring. The client will be<br />

responsible for any data retrieval, shipping or handling fees that may apply.<br />

Some non-English language CapScore response forms are available. Otherwise, arrangements can be made to have<br />

non-English assessments scored using a spreadsheet method. This is particularly true when new translations are<br />

being developed and validated. If you arranged this scoring option when purchasing use licenses, contact Insight<br />

Assessment for instructions when scoring of assessment data is required.<br />


Section 3:<br />

Results Reported<br />

This section presents a step by step guide to the interpretation of the scores<br />

reported for this instrument. Assessment reporting formats include charts,<br />

statistical tables, spreadsheets, and individual reports.<br />

Interpreting CCTST Score Reports<br />

Reports of the scores of individuals are presented in<br />

spreadsheets showing all scores and demographic<br />

responses for each individual in a group, and as PDF files<br />

each showing the scale scores for a given individual.<br />

Reports of the scores of groups are presented as PDF files<br />

which include statistical tables and bar charts for each<br />

scale on the assessment instrument. This section describes<br />

each of these types of reports and how to interpret the<br />

numerical scores and recommended performance<br />

assessment scores displayed.<br />

[Sidebar: Scores for the CCTST, CCTST-N, and CCT-G835: an OVERALL Score (numerical), a Percentile Score, and a Recommended Performance Assessment; and, for each skill scale, a numerical score and a recommended performance assessment.]<br />

Each test taken provides you with four types of<br />

information about your test takers: An OVERALL Score of<br />

critical thinking skill, a recommended performance<br />

assessment of the strength of the OVERALL Score (“categorical” or “qualitative” score), the percentile ranking of the<br />

OVERALL Score when compared to a group of similar test takers, and a set of scale scores that help you to understand<br />

which of the skills areas are particularly strong and which are weaker and require training attention.<br />

Interpreting Individual Test taker Score Reports<br />

The 4-Step Process<br />

We recommend following a four step interpretation process to fully review all of the information provided by the<br />

scores package. Each step provides insight regarding the strength of critical thinking skills in your test takers.<br />

This 4-step process is the same whether your data are gathered online or in paper-and-pencil format. It is<br />

also informative to use this process both for individual test taker scores and for interpreting group scores.<br />


Example 1: Interpreting an Individual Score Report<br />

Here is an example of how to interpret an individual score report using the 4-step process. This process is actually<br />

quite easy. Use the pictured report (Figure 1 on the next two pages) to see the process in action.<br />

Step 1: Examine the OVERALL Score. The OVERALL Score for this example test taker is 87. It is shown on the<br />

first page with a brief description of what the OVERALL Score means.<br />

Session Duration: It can also be informative to examine the information available regarding testing<br />

conditions. This report is of a test completed online where the time on test is captured in the report. The<br />

second page of this individual report records the testing “Session Duration” as 40 minutes and 52 seconds,<br />

adequate time to complete the test. This means that this test taker took just under 41 minutes to go from<br />

opening the first test question until the test taker submitted all responses. The Session Duration does not<br />

include time the test taker may have spent completing individual profile demographic questions prior to<br />

actually beginning the test itself.<br />

Step 2: Examine the comparison percentile. This individual’s OVERALL Score is compared to the 4-year<br />

college undergraduate percentiles, and it ranks at the 89th percentile nationally (shown at the top of the<br />

first page under the description of the OVERALL Score, and followed by a statement to assist the test taker<br />

in understanding the difference between a “percentile” and “percent correct”).<br />

Step 3: Examine the Performance Assessment of the OVERALL Score. The performance rating for this<br />

individual’s OVERALL Score is Superior. These recommended ratings are based on peer reviewed and<br />

internal studies linking scores to learning readiness, academic program completion and work performance.<br />

This is determined by Table 3, using the CCTST-N 100-point version cut scores. The Superior Recommended<br />

Performance Assessment is also reported to the test taker (First page, first thing under the graphs).<br />

Step 4: Examine the Scale Scores. The Scale Scores indicate areas of strength and areas where improvement<br />

is needed. The Scale Scores for this individual are presented in both numerical and recommended<br />

performance assessment forms. On this test taker’s report the recommended performance assessments<br />

for Analysis, Inference, Evaluation, Induction, Deduction, Interpretation, and Numeracy are Superior, for<br />

Explanation skills the recommended performance assessment is Strong.<br />


Figure 1: Sample Individual Test Taker Report (page 1)<br />


Figure 1 Continued: Sample Individual Test Taker Report (page 2)<br />


Interpreting Spreadsheet Score Reports<br />

If you are testing online, you can download individual scores in spreadsheet form. If you are testing in<br />

paper-and-pencil format, individual scores are sent to you in spreadsheet form when you return your CapScore response<br />

forms for scoring. If you are using one of our full-service options, spreadsheet reports are provided as per your<br />

original instructions or upon request.<br />

Example 2: Interpreting the Spreadsheet Score Report<br />

Here is an example of how to interpret the spreadsheet report that lists the scores for each of your individual test<br />

takers. The partial spreadsheet showing demographic responses, Table 1A below, displays information supplied by<br />

the client who administered the test, such as Assignment Description, Assignment Number and Group. And it shows<br />

each individual’s responses to questions asking for Name, Email Address, Age, Gender, Ethnicity, and other<br />

demographic information which the client test administrator sought to gather using the custom question feature of<br />

the testing interface.<br />

In this example, which is a sample of college undergraduates, the client test administrator asked the students to<br />

indicate undergraduate year, school, and academic major. Note that in some cases individuals elected not to supply<br />

the information requested. One person did not respond to custom question #2 or custom question #3. And that<br />

same person elected the system response “I choose not to provide this information” for “Gender” and “Ethnicity.”<br />

The other eleven responded to all the demographic questions. For privacy reasons names and emails have been<br />

redacted from this example.<br />

Table 1A: Partial Spreadsheet Report of Individual Demographics (right side)<br />


The spreadsheet also includes columns reporting the OVERALL Score, the Comparison Percentile, the Scale Scores,<br />

and available information about the testing conditions. Table 1B is an example graphic to demonstrate the<br />

interpretation of these scores.<br />

Table 1B: Partial Spreadsheet Report of Individual Scores (left side)<br />

Table 1B displays test score information about individual test takers. The individual’s OVERALL Score, Comparison<br />

Percentile, and Scale Scores are in the columns with the tan header. If this client had chosen to use the CCTST-N,<br />

there would be an additional column listing scores for Numeracy. The next column, with the blue header, indicates<br />

the percentage of the questions on the test to which the individual responded. “1” means that the individual<br />

responded to 100% of the questions. And the right hand column in Table 1B shows how many minutes the test taker<br />

spent on the test.<br />

If the client’s test takers completed the instrument using the Insight Assessment online testing system, the numbers<br />

reported under the ID variable column represent the unique identifiers created for that individual by our online<br />

testing system. If a client using our online testing system wishes to have individual test takers use some other<br />

identification number, it is recommended that the client create a custom profile question and inform the test takers<br />

how to respond to that custom question. If the score report is produced from paper-and-pencil testing using<br />

CapScore, or if the instrument is administered from within the client’s LMS (learning management system), then<br />

the data in this column is the client defined ID.<br />

Example 2: Interpreting the Spreadsheet Score Report (Continued)<br />

The CCTST OVERALL Score is the best overall measure of critical thinking skills when the purpose is to compare<br />

individuals or groups of individuals. The OVERALL Score on this family of critical thinking skills tests has been shown<br />

to predict success in workplace contexts, the successful completion of educational programs, and passing scores on<br />

certification and licensure examinations.<br />


Step 1: Interpret Each Individual’s OVERALL Score<br />

CCTST OVERALL Score Interpretation: Going back to Table 1B, if you examine the CCTST OVERALL Score, you<br />

can observe that the CCTST OVERALL Scores for these twelve tests range from 7 to 31. In any sample of test takers<br />

there is likely to be a range of values for CCTST OVERALL Score. If you use an agency-specific numerical cut score for<br />

the CCTST OVERALL Score at your agency, reference it against the information in the spreadsheet column headed<br />

OVERALL for this purpose.<br />

Examine Any Information Provided About Test-Taking Behavior: Two columns on the spreadsheet report provide<br />

information about test-taking behavior, namely each person’s minutes on test and each person’s percent answered.<br />

Minutes on Test: This is a good time to examine whether the parameters of testing are as expected for each<br />

test taker. The CCTST is intended to be challenging. In contrast to reactive thinking or memory recall tests, a test<br />

of reflective thinking skills takes a bit of time. Reading test items and responding thoughtfully to each one demands<br />

more than 15 minutes of cognitive effort. As a conservative indicator of a false test result, we recommend discarding<br />

tests if the test taker gave less than 15 minutes of effort.<br />

Table 1B shows that one individual, ID 477990, completed the test in only 15 minutes. This person is not likely to<br />

have provided sufficient effort to submit a true test of their critical thinking ability. This person’s CCTST OVERALL<br />

Score of 7, and therefore the percentile ranking at the 2nd percentile, are very probably falsely low. Individuals spending<br />

less than 15 minutes on a skills test, like the CCTST, probably have not given their best sustained cognitive effort, in<br />

which case they may not have accurately represented their true level of skill.<br />

Percent Answered (Ratio): This column in the spreadsheet reports the ratio of items answered to the total items<br />

on the test (ratio = 1 when all questions are answered). Most people complete all items on the test in the time allowed,<br />

but some individuals leave one or more items unanswered. Tests with large numbers of items left unanswered may<br />

indicate language comprehension difficulties, reading issues, poor effort, or poor time management.<br />

For Table 1B all but one of the test takers responded to every item on the test. ID 433898 left some questions<br />

unanswered as indicated by the reported ratio of .88. Ratios of less than 0.60 (60% of items completed) are unusual,<br />

as most test takers complete all questions on the CCTST in the allotted time. Tests submitted with fewer than 60% of<br />

items completed may indicate language comprehension difficulties, reading issues, poor effort, or poor time<br />

management.<br />
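The two screening rules described above (fewer than 15 minutes on test, or fewer than 60% of items answered) can be applied mechanically to a downloaded spreadsheet. A minimal sketch follows; the column names and sample rows are assumptions for illustration, not the actual export format.<br />

```python
# Sketch: screening spreadsheet rows for possibly invalid tests, using
# the thresholds described in the manual text. Column names and the
# sample data below are hypothetical.

MIN_MINUTES = 15
MIN_ANSWERED_RATIO = 0.60

def flag_suspect_tests(rows):
    """Return IDs of tests that are probably not true measures of skill."""
    flagged = []
    for row in rows:
        too_fast = row["minutes_on_test"] < MIN_MINUTES
        too_incomplete = row["percent_answered"] < MIN_ANSWERED_RATIO
        if too_fast or too_incomplete:
            flagged.append(row["id"])
    return flagged

rows = [
    {"id": "477990", "minutes_on_test": 14, "percent_answered": 1.00},
    {"id": "433898", "minutes_on_test": 38, "percent_answered": 0.88},
    {"id": "512345", "minutes_on_test": 45, "percent_answered": 1.00},
]
print(flag_suspect_tests(rows))  # → ['477990']
```

Flagged tests still warrant human review, since low completion can also reflect language or reading issues rather than low effort.<br />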

Step 2: Examine Individual Comparison Percentile Scores<br />

An individual’s Percentile Score is based on that test taker’s OVERALL Score. The Percentile Score<br />

compares the test taker with the external benchmark comparison group (e.g. a national sample of test takers similar<br />

to the group being tested). Clients can select a comparison group most like their test sample each time they test a<br />

new group of test takers. Within any sample of test takers there is likely to be a wide range of CCTST OVERALL Scores<br />


and a wide range of the corresponding percentile rankings. If you use an agency-specific percentile cut score for the<br />

CCTST, reference it to the reported comparison percentile score.<br />

For example, the comparison percentiles, in the column marked Percentile, for the sample in Table 1A range from<br />

the 2nd to the 98th percentile.<br />

A score that falls at the 60th percentile means that roughly 59 people out of 100 will score lower than this test taker<br />

and 40 persons out of 100 will score higher than this test taker in the national comparison group.<br />
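The percentile interpretation above can be made concrete with a small worked example. The comparison sample below is invented for illustration; the real national norm groups are maintained by Insight Assessment and are not reproduced here.<br />

```python
# Sketch: percentile ranking of an OVERALL Score against a hypothetical
# comparison group (percent of the group scoring strictly below the score).

def percentile_rank(score, comparison_scores):
    """Percent of the comparison group scoring below `score`, rounded."""
    below = sum(1 for s in comparison_scores if s < score)
    return round(100 * below / len(comparison_scores))

# Hypothetical comparison group: 100 OVERALL Scores spread from 1 to 25.
comparison = list(range(1, 26)) * 4

print(percentile_rank(20, comparison))  # → 76
```

Here a score of 20 outranks the 76 comparison scores that fall below it, so it sits at the 76th percentile of this invented group.<br />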

Available CCTST & CCTST-N Comparison Percentiles<br />

At the time of this edition of the user manual, the list of comparison groups included:<br />

National technical and community college students (2-yr. colleges)<br />

National undergraduate students (4-yr. colleges and universities)<br />

National graduate students and professionals<br />

National health science undergraduates (4-yr. colleges and universities)<br />

National health science graduates and professionals<br />

G835 college graduates and professionals<br />

Check the website for the most recently updated list of available CCTST and CCTST-N comparison<br />

groups. Inquire by phone or email about customized comparison groups.<br />

Step 3: Determine the Strength of the Scores<br />

OVERALL Scores can be interpreted as to their relative strength using recommended performance assessment<br />

descriptors. This is useful for studying both individuals and professional cohorts (Table 2).<br />

Superior: This result indicates critical thinking skill that is superior to the vast majority of test takers. Skills at the<br />

superior level are consistent with the potential for more advanced learning and leadership.<br />

Strong: This result is consistent with the potential for academic success and career development.<br />

Moderate: This result indicates the potential for skills-related challenges when engaged in reflective<br />

problem-solving and reflective decision-making associated with learning or employee development.<br />

Weak: This result is predictive of difficulties with educational and employment related demands for reflective<br />

problem solving and reflective decision making.<br />

Not Manifested: This result is consistent with insufficient test taker effort, cognitive fatigue, or possible<br />

reading or language comprehension issues.<br />

Table 2: Descriptions of Recommended Performance Assessments for OVERALL Scores<br />


Table 3 displays the Recommended Performance Assessments for the OVERALL Score on the CCTST, CCTST-N, and<br />

CCT-G835.<br />

Test version | Not Manifested | Weak | Moderate | Strong | Superior<br />

CCTST / CCTST-N OVERALL Score (34-point Form 2000 versions) | 0-7 | 8-12 | 13-18 | 19-23 | 24 or higher<br />

CCTST and CCTST-N OVERALL Score (100-point versions) | 50-62 | 63-69 | 70-78 | 79-85 | 86 or higher<br />

CCT-G835 OVERALL Score | 50-65 | NA | 66-74 | 75-84 | 85 or higher<br />

Table 3: Recommended Performance Assessments for the OVERALL Score<br />

To interpret the strength of the OVERALL Score, use the row in Table 3 that corresponds to the version of the test<br />

administered.<br />

For the CCTST / CCTST-N OVERALL Score reported on a 100-point version, a score of 86 and higher indicates critical<br />

thinking skill that is superior to the vast majority of test takers and for this reason is designated as Superior. Scores<br />

in this range are associated with strong preceptor ratings and work performance and are indicative of leadership<br />

potential. On this same 100-point version, scores less than 70 indicate weak overall skill or no manifestation of critical<br />

thinking skill and have been associated with poor performance educationally, in the workplace, and on professional<br />

licensure examinations.<br />

For example, referring again to Table 1B, the OVERALL Scores for individual test takers range from 7 to 31. The<br />

test taker score of 7 corresponds to the recommended performance assessment of Not Manifested. The test taker<br />

score of 31 on the 34-point version of the CCTST demonstrates superior overall skill in critical thinking.<br />

Comparing all the OVERALL Scores in Table 1B to the Recommended Performance Assessments table (Table 3), using the 34-point CCTST row, it can be seen that one person did not manifest critical thinking skill, two people displayed Weak overall skill, six fell into the Moderate recommended performance assessment level, two showed Strong overall skill, and one displayed Superior critical thinking skill overall. The recommended performance assessments of the individual CCTST OVERALL Scores allow the observation that, with the exception of one score which is not likely to be a true score, this group of test takers demonstrates generally moderate skills, with a couple of people weak, a couple strong, and one individual exceptionally skilled in critical thinking.<br />
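The cut-score lookup described above can be sketched in code. A minimal illustration, assuming the 34-point Form 2000 cut scores shown in Table 3 (the function name is our own, not part of any Insight Assessment tool):<br />

```python
# Sketch: mapping a 34-point CCTST OVERALL Score to its recommended
# performance assessment, using the Table 3 cut scores for the
# 34-point Form 2000 versions.

def assess_overall_34pt(score: int) -> str:
    """Return the recommended performance assessment for a 34-point
    CCTST OVERALL Score (cut scores per Table 3)."""
    if score <= 7:
        return "Not Manifested"
    elif score <= 12:
        return "Weak"
    elif score <= 18:
        return "Moderate"
    elif score <= 23:
        return "Strong"
    else:
        return "Superior"

# The two scores discussed in the example above:
print(assess_overall_34pt(7))   # Not Manifested
print(assess_overall_34pt(31))  # Superior
```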

48 CCTST User Manual and Resource Guide © 2016 Insight Assessment / The California Academic Press. San Jose CA. All rights reserved.<br />

Step 4: Examine the Performance Assessment of the Scale Scores<br />

The Scale Scores are useful for identifying areas of strength in the individual and areas of relative weakness that should be addressed in subsequent educational opportunities. Although the specific skill scores reported have internal consistency reliability, test-retest reliability, and strong value as indicators of specific strengths and weaknesses, they are not independent factors, which is theoretically appropriate to the holistic conceptualization of critical thinking as the process of reasoned and reflective judgment, rather than simply a list of discrete skills.<br />

Referring again to Table 1B, examine the individual scale scores for each of the twelve test takers. In each case, use Table 4. Test taker 477990, who submitted a hastily completed test ranking at the 2nd percentile nationally, has scale scores reflecting a recommended performance assessment of Not Manifested in each scale area.<br />

Expanded testing options offering additional scales and alternative score ranges require that recommended<br />

performance assessments of your CCTST scale scores be made with a cut score table that corresponds to the form<br />

of the test that was administered. Table 4 displays cut scores for interpreting the recommended performance<br />

assessment of the 34-point CCTST scale scores, Table 6 does the same for the newer 100-point versions.<br />

CCTST Scale Scores (34-point version) | Not Manifested | Moderate | Strong<br />
Analysis | 0-2 | 3-4 | 5 or more<br />
Inference | 0-5 | 6-11 | 12 or more<br />
Evaluation | 0-3 | 4-7 | 8 or more<br />
Induction | 0-5 | 6-11 | 12 or more<br />
Deduction | 0-5 | 6-11 | 12 or more<br />

Table 4: Recommended Performance Assessments 34-Point CCTST Scale Scores<br />

Example 2: Interpreting Spreadsheet Score Reports (Continued)<br />

For your convenience, Table 5 below is a reprint of the score report provided in Table 1B. This time there are colors noting the corresponding recommended performance assessments. Blue scores are Strong, and red scores indicate that the specific skill being measured was Not Manifested.<br />


Table 5: Example of Scale Score Interpretation<br />

In this small sample there is one individual test taker who is strong in all five of the reported scale areas and one that<br />

is strong in two areas. And there is one test taker whose scores on each scale indicate that the skill being measured<br />

was not manifested. One person has strengths in Analysis, but difficulties with Evaluation. Other test takers<br />

generally score in the moderate range for each scale. In this sample Analysis scores are generally strong. In the next<br />

portion of this manual, group scores are examined more closely.<br />

Tables 6 and 7 provide the recommended performance assessments for the 100-point forms of the CCTST scale<br />

scores and for the CCT-G835 scale scores.<br />


CCTST and CCTST-N Scale Scores (100-point versions) | Not Manifested | Weak | Moderate | Strong | Superior<br />
All Scale Scores | 50-62 | 63-69 | 70-78 | 79-85 | 86-100<br />

Table 6: Recommended Performance Assessments for 100-Point CCTST Scale Scores<br />

These are online versions of the test that offer expanded scale scores. Only the CCTST-N reports a Numeracy score.<br />

CCT-G835 Scale Scores | Not Manifested | Moderate | Strong | Superior<br />
All Scale Scores | 50-65 | 66-74 | 75-84 | 85-100<br />

Table 7: Recommended Performance Assessments for CCT-G835 Scale Scores<br />


Interpreting Group Score Reports<br />

Interpreting group score reports follows the same 4-step process used to interpret individual scores. In this case the<br />

emphasis is placed on the meaning of the scores for the group as a whole.<br />

Step 1: Examine the value of the mean OVERALL Score for the group of test takers.<br />

Step 2: Examine the Percentile Ranking, which is the average of the percentile scores of the test takers in this group.<br />

Step 3: Determine the strength of the mean OVERALL Score using the Recommended Performance Assessments table.<br />

Step 4: Interpret the mean Scale Scores for this group of test takers.<br />
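The descriptive statistics examined in these steps (mean, minimum, maximum, and quartiles, as they appear in a Table 8-style group report) can be sketched with the Python standard library. The scores below are invented for illustration only:<br />

```python
# Sketch: computing Table 8-style group statistics for a set of
# OVERALL Scores. All values here are hypothetical.
import statistics

overall_scores = [58, 66, 71, 74, 75, 76, 78, 80, 83, 94]  # illustrative only

mean_score = statistics.mean(overall_scores)
q1, median, q3 = statistics.quantiles(overall_scores, n=4)  # quartile cut points

print(f"N = {len(overall_scores)}")
print(f"Mean OVERALL Score = {mean_score:.1f}")
print(f"Min = {min(overall_scores)}, Q1 = {q1}, Median = {median}, "
      f"Q3 = {q3}, Max = {max(overall_scores)}")
```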

The Table of Statistics and the Group Histogram<br />

Included in the results package for hands-on administration is an analysis of the basic statistics describing the score distributions for a group of individuals who completed the same assessment assignment, e.g., a group of exiting students from XYZ University all of whom completed an assignment to take the CCTST online. Table 8 displays statistical information generated from the scores which the members of this group achieved on that assignment.<br />

Table 8: Group Scores for XYZ University<br />


Step 1: Interpret the Group’s Mean OVERALL Score<br />

The group’s mean OVERALL Score is the average of the OVERALL Scores for each member of the group tested and is the best comprehensive measure of the critical thinking skills of the group as a whole. This number is useful as documentation of the level of achievement of learning goals set for the group as a whole. Examining changes in the mean scores for testing groups over time makes it possible to assess the effectiveness of staff or student critical thinking skills development programs.<br />

Example 3: Interpreting Group Score Reports<br />

For example, the mean OVERALL Score for XYZ University is 75.8. Notice that there are 438 test takers in this sample. We can also see that the OVERALL Scores in the group range from 58 (minimum score) to 94 (maximum score). The 25th percentile for this group from XYZ University is 71 (Quartile 1) and the 75th percentile score is 80 (Quartile 3).<br />

Figure 2 below displays the score distribution. How should this group of scores be interpreted? Are the scores<br />

adequately strong? To answer these questions, complete steps 2-4.<br />

Figure 2: OVERALL Score Distribution for XYZ University - Undergraduate Sample<br />


Visit YouTube to view our video on how to interpret Group Score Histograms.<br />

Step 2: Examine the Mean of the Percentile Scores of the Group<br />

In this case, the scores from XYZ University have been compared to the national comparison percentiles for four-year college students. This is the comparison group chosen by the client. For other available comparison groups, see the previous section or consult our website for the most up-to-date listing. The percentile reported for the group is the mean of the percentile scores of each individual. In this case the group percentile reported for XYZ University is the 46th. The group as a whole is just below the national undergraduate comparison percentile for critical thinking skills. Although some test takers in the group are very weak, others are exceptionally strong.<br />

Use the Insight Assessment Report Generator Online<br />

The statistical table group report, the group histogram, and the spreadsheet of individual scores can<br />

be created and downloaded by customers using our online system. All completed tests in the selected<br />

data set which have at least 60% of the questions answered are included in the analysis.<br />

In the case of testing administered online, as a quality enhancement, only submitted tests where time<br />

on test is at least 15 minutes are included in the group analysis. Mean scores are negatively and falsely<br />

affected when incomplete assessments are included in the group analysis.<br />

Spreadsheet reports, however, do include all individual test results, regardless of time on test or percent answered. Additional discussion regarding the handling of false test attempts is included below.<br />
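The inclusion rule described in this box can be sketched as follows. The record layout and function name are our own invention, not Insight Assessment's system: a test enters the group analysis only if at least 60% of items were answered and time on test is at least 15 minutes.<br />

```python
# Sketch of the group-analysis inclusion rule: >= 60% of items
# answered and >= 15 minutes time on test. Records are hypothetical.

def include_in_group_analysis(pct_answered: float, minutes_on_test: float) -> bool:
    return pct_answered >= 0.60 and minutes_on_test >= 15.0

submissions = [
    {"id": "477990", "pct_answered": 0.45, "minutes": 9},   # hypothetical hasty test
    {"id": "501222", "pct_answered": 0.98, "minutes": 41},  # hypothetical full effort
]

included = [s for s in submissions
            if include_in_group_analysis(s["pct_answered"], s["minutes"])]
print([s["id"] for s in included])  # only the complete, unhurried test remains
```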


Step 3: Determine the Strength of the Mean OVERALL Score Using the<br />

Recommended Performance Assessments Table<br />

The colored bars indicate how many of the 438 fall within each of the five recommended performance assessment levels identified, with red indicating that critical thinking skills were Not Manifested, orange showing Weak overall skills, yellow indicating Moderate skills, green showing Strong skills, and blue indicating Superior overall critical thinking skills. Using the recommended cut scores that correspond to the 100-point versions of the CCTST (Table 3), Figure 3 shows how the CCTST OVERALL Scores array across the recommended performance assessment levels in this group of test takers. Notice that this is the same CCTST OVERALL Score distribution that was displayed in Figure 2, but this time the recommended performance assessments are marked. Few test takers in this group have scores that are Not Manifested or Weak. Even though the group as a whole scores very near the national mean for its selected benchmark comparison group, there are many scores in the Strong range and also scores in the Superior range.<br />

Figure 3: Recommended Performance Assessments of the XYZ University Sample<br />

To complete this analysis of the group of 438, we need only to examine the CCTST scale scores to see where this<br />

group was particularly weak and where they were strong.<br />


Step 4: Interpret the Group’s Scale Scores<br />

Scale scores are important for identifying areas of strength and weakness. When the group is representative of your<br />

program or institution or company, your group scores can give direction to the development of programs to help<br />

employees and students improve their critical thinking skills. For example, if the group is relatively weak in one or<br />

more skill areas (Analysis, Inference, Evaluation, Inductive, or Deductive Reasoning skills), novel scenarios, case<br />

studies, or group problem-solving exercises can be designed to emphasize and practice those skills.<br />

Table 9 (Reprinted Table 8): Group Scores for XYZ University<br />

Using Table 3 to interpret the OVERALL Scores and Table 6 to interpret the Scale Scores, we see that this group has more strength in Analysis and Interpretation, but weaknesses which can be addressed in Evaluation, Explanation, and Deduction. Here blue highlighted scores indicate that scores fall in the Superior recommended performance assessment level, green shows scores at the Strong level on average, yellow indicates Moderate scores, orange highlights Weak scores, and red indicates Not Manifested.<br />

Looking at the minimum and maximum scores within each skill, we see within each skill at least one test taker who<br />

does not manifest that skill and at least one who shows a superior level of that skill.<br />


Q1 scores (bottom 25% of this sample) are, on average, in the Moderate range. Q3 scores (top 25% of this sample)<br />

are in the Strong range for the skill areas Inference, Evaluation, Explanation, Induction, and Deduction and in the<br />

Superior range for Analysis and for Interpretation.<br />

Half of the sample is strong in Analysis, Interpretation, and Inductive Reasoning. Future training might best be<br />

focused on Evaluation skills, Explanation skills, and Deductive Reasoning skills.<br />

Scores for each of the scales are presented in separate histograms in the group report. Above each is a bar that describes the basic statistics for the scale scores. The example below is of the Numeracy scale, such as the one included in the CCTST-N instrument. The Numeracy measure, like the other Scale Scores on the CCTST-N, reports scores on a 100-point scale.<br />

By applying Table 8 or by reference to the histogram displayed, one can determine that the group’s mean score of<br />

76.6 falls within the Moderate range. By reference to Table 8 we can infer that the Q1 score of 71 implies that the<br />

top 75% of this group of 1005 test takers score in the Moderate range or higher. The Q3 score of 82 indicates that<br />

at least the top 25% score in the Strong or Superior ranges. By adding the number of test takers as indicated along<br />

the left axis for the orange and red bars, we can determine that roughly 200 of these 1005 individuals have weak<br />

numeracy skills or were not able to manifest their numeracy skills. Figure 5 provides a graphic distribution of the<br />

Numeracy scores for this example test taker group.<br />

Figure 4: Distribution of Numeracy Scores for ABCD University<br />


Figure 5: Distributions of Scale Scores for QRST University<br />

Figure 5 shows individual histograms (bar charts displaying the frequency of Scale Scores for each metric measured<br />

by versions of the critical thinking skills tests that include Numeracy). These can be used to display the relative<br />

strength of scores in each of the scale areas.<br />


Important Considerations When<br />

Analyzing Score Reports<br />

Difference Scores, Gains, Outliers, Discarding False Tests …<br />

Do specific critical thinking skill scores represent independent factors? No. Here is why. When we talk about critical thinking we are discussing a holistic human reasoning process which results in a singular judgment about what to believe or what to do. Like safe driving, a person cannot just be proficient at gas pedal skills, and we cannot evaluate gas pedal skills without considering how those skills fit into the whole process of safe driving. The two-thousand-year-old traditional delineation of reasoning skills, which divides them into deductive or inductive, cross-cuts the APA Delphi Report’s list of core critical thinking skills. This means that any given inference or analysis or interpretation, for example, might be classified as deductive or as inductive, depending upon how the theoretician conceives of these more traditional categories. Conceptually the skills in the Delphi list are not necessarily discrete cognitive functions either, but in actual practice are used in combination during the process of forming a reasoned judgment, that is, critical thinking. In some contexts, however, a given skill can be considered foremost, even though other skills are also being used. For example, a given test question may call heavily upon a test taker’s numeracy skills, while at the same time requiring the correct application of the person’s analytical and interpretive skills. For these reasons, and others relating to test design and cognitive endurance, the questions on the CCTST, in its various versions, may or may not be used on more than one scale. As a result, although the specific skill scores reported have internal consistency reliability, test-retest reliability, and strong value as indicators of specific strengths and weaknesses, they are not independent factors.<br />

Educationally Significant Gains in Group Mean Scores: A score improvement of even one point for a given individual signifies that this individual correctly analyzed a scenario and identified the correct response while not falling victim to other common reasoning errors presented in other response choices. In some groups some individuals will demonstrate larger gains as the result of a focused educational program, and there may be others who demonstrate no improvement. See additional comments below in “Difference Scores”. As a result, a mean score improvement for the group of one point from pretest to posttest is indicative of some degree of educationally significant improvement. Larger samples will deliver a statistically significant result.<br />

Difference Scores – Numerical and Recommended Performance Assessment (Qualitative): When the same individuals have taken the assessment at two time points (before and after a treatment designed to train critical thinking skills), one can measure gains by examining individual difference scores for each individual (T2 - T1, where the “T” stands for “Time”).17 These are valuable data for examining the effects of an educational or training initiative. Individual difference scores tend to be rather varied and can be used to further explore the effectiveness of a training intervention. Gains can be easily seen as positive values for ‘X’ in the equation (Score at Time 2 minus Score at Time 1 = X). Negative values are also possible and, if they are true scores, require equal attention. Difference scores obtained in this manner are numerical findings which may or may not be statistically significant. A further discussion of difference scores can be found below.<br />

Gains in Relationship to Sample Size: Sample size is an important factor in statistical analysis. Larger gains are required for statistical significance to be attained in smaller samples. A group gain of two points is educationally significant for the group overall and likely represents very significant gains in many individuals within the group. If there are fewer than 30 persons in the group, however, statistical tests may report this range of gain as not statistically significant.<br />

Figure 6: Pretest and Posttest OVERALL Scores Comparison<br />

Representativeness: We recommend caution when attempting to generalize from small sample results to<br />

assumptions about the population as a whole, unless the sample of test takers is representative of the larger<br />

population. For example, the test results from a sample of 200 students, all of whom have volunteered to be tested,<br />

may not be representative of the larger population of students. Similarly, test scores from a sample of freshmen who<br />

are residential students may not be representative of the larger population of undergraduates if this larger group<br />

includes distance learners, transfer students, and adult part-time students.<br />

Independent vs. Matched-pairs Comparisons: Group comparisons can be analyzed statistically in a<br />

matched-pairs approach (which associates each individual’s posttest score with his or her pretest score) or, when<br />

the groups are not composed of exactly the same set of individuals, as aggregations of scores. When possible, we<br />

recommend using the matched-pairs approach for pretest posttest comparisons.<br />
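A matched-pairs analysis works on each individual's own pretest-posttest pair. A standard-library sketch of the paired t statistic for such a design follows; the scores are invented for illustration, not real CCTST data:<br />

```python
# Sketch: matched-pairs (paired) t statistic computed from each
# individual's pretest and posttest scores. Values are hypothetical.
import math
import statistics

pretest  = [70, 74, 68, 77, 72, 80, 69, 75]
posttest = [73, 75, 70, 79, 74, 83, 70, 78]

# Each person's difference score: posttest minus pretest
diffs = [post - pre for pre, post in zip(pretest, posttest)]

mean_diff = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)  # sample standard deviation of the differences
t_stat = mean_diff / (sd_diff / math.sqrt(len(diffs)))

print(f"mean gain = {mean_diff:.2f}, t = {t_stat:.2f}, df = {len(diffs) - 1}")
```

Compare `t_stat` against the critical t value for `len(diffs) - 1` degrees of freedom to judge significance; with small groups, a larger mean gain is needed to reach it.<br />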

17 For example, in a pretest-posttest design the difference score is the posttest score minus the pretest score, that is T2 – T1.<br />

Outliers: ‘Outlier’ is the term used to refer to a data point that is out of scale from the other data in the sample (and therefore possibly less valuable for describing the sample). But in the case of scores on the CCTST or other<br />


critical thinking skills tests at Insight Assessment, very high and very low scores are not simply outliers. These critical thinking skills tests are constructed to achieve accurate scores throughout the continuum. Outliers should be regarded as true scores unless there is some other reason to regard them as false scores. It is important to remember that there may be other theoretically justifiable and preferable reasons to eliminate scores from the calculation of the group mean (because they are likely to be false scores), particularly when your data is being used to justify funding or to evaluate the effectiveness of a particular training program.<br />

Two justifiable reasons to eliminate scores include 1) tests where less than 60% of the items have been answered and 2) submitted tests with less than 15 minutes of time on test. Critical thinking skills, once acquired, are not quickly lost from cognition, so the observation of a significant drop in OVERALL Score from pretest to posttest for a given individual is an indicator of a false test at posttest. One can examine difference scores from pretest to posttest (posttest score – pretest score = difference score) and conservatively set a value as worthy of further examination and possibly indicative of a likely false posttest score (any difference score that is equal to or less than -3).<br />
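The conservative screening rule just described (flagging any pretest-to-posttest drop of three or more points for further examination) can be sketched as follows; the test taker IDs and scores are invented:<br />

```python
# Sketch: flagging likely false posttest scores using the conservative
# threshold described above (difference score <= -3).

def likely_false_posttest(pretest: int, posttest: int, threshold: int = -3) -> bool:
    """Flag a posttest as possibly false when the difference score
    (posttest - pretest) is at or below the threshold."""
    return (posttest - pretest) <= threshold

pairs = {"A": (74, 76), "B": (80, 75), "C": (68, 68)}  # hypothetical test takers

flagged = [tid for tid, (pre, post) in pairs.items()
           if likely_false_posttest(pre, post)]
print(flagged)  # ['B'] -- a five-point drop warrants further examination
```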

Meta-cognition: Deliberative decision making and reflective problem solving, as contrasted with making snap<br />

judgments, include the skills of self-monitoring and self-correction when necessary. Often referred to as “metacognition”,<br />

this feature of critical thinking is dependent on the interplay of the other core critical thinking skills --<br />

evaluating an analysis, explaining an interpretation, or inferring from an evaluation, for example. The OVERALL Score<br />

is considered the best estimation of the strength of meta-cognition.<br />

Proficiency and Competency <strong>Test</strong>ing: The testing client can determine the operational meaning of<br />

“proficient” or “competent” as best fits its needs and purposes in several ways. The client may elect to identify a<br />

numerical score, an external percentile score, a recommended performance assessment, a minimum pretest or<br />

posttest score, or a locally determined benchmark score which test takers must achieve in order to be regarded as<br />

having demonstrated proficiency or competency in critical thinking for the client’s assessment purposes.<br />

Assessing the Effects of an Educational Program<br />

CCTST scores will improve with the effective training of reasoning skills. There are a number of ways to document quality improvement in critical thinking skills for your institution or organization. One method is to compare the difference in the mean scores for your group at Time-1 (First Assessment: Pretest) with the scores at Time-2 (Second Assessment, occurring after some training program: Posttest). Do this by subtracting the group’s Time-1 mean score from the group’s Time-2 mean score. This method compares the scores as a whole rather than looking at changes within the individual. The expected positive change from Pretest to Posttest will not tell you how many of the individuals have improved, nor will it describe the range of improvement made by each individual.<br />

To learn more about the proportion of individuals who have improved and the degree to which they have improved,<br />

the best method is to calculate the difference scores from Time 1 (Pretest) to Time 2 (Posttest) for each person you<br />

have assessed. The OVERALL Score is the score that is best used to calculate difference scores (difference score =<br />

OVERALL Score at posttest minus the OVERALL Score at pretest). Individuals will have made different progress as the<br />

result of an educational offering aimed at building critical thinking skills. Some may not have improved their critical<br />

thinking skills during the interim time period. If the difference score is at or near zero, the individual has shown no effect as a result of the educational training program.<br />
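The per-person difference-score calculation described above can be sketched as follows, using invented pretest and posttest OVERALL Scores:<br />

```python
# Sketch: individual OVERALL difference scores (posttest minus pretest)
# and the proportion of test takers who improved. IDs and scores are
# hypothetical.
pretest  = {"s1": 70, "s2": 74, "s3": 68, "s4": 77}
posttest = {"s1": 74, "s2": 74, "s3": 71, "s4": 76}

diffs = {sid: posttest[sid] - pretest[sid] for sid in pretest}
improved = sum(1 for d in diffs.values() if d > 0)

print(diffs)                          # per-person gains (or losses)
print(f"{improved}/{len(diffs)} improved")
```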

A third option is to calculate difference scores for the comparison percentiles reported for each individual at Pretest<br />

and Posttest. Example: Improving your group’s overall percentile score from 52nd to 58th percentile demonstrates<br />

a significant overall gain in critical thinking scores.<br />

Individual difference scores are the most informative measure of the effectiveness of training techniques. In any given sample<br />

when individuals test twice (‘pretest-before training’ and ‘posttest-after training’), individual difference scores<br />

(posttest score minus pretest score) will demonstrate individual gains in critical thinking scores. The demonstration<br />

of a two-point gain in OVERALL Score from pretest to posttest is associated with becoming more metacognitive<br />

about the reasoning process and with observable evidence of efforts to actively build reasoning skills. Large<br />

individual difference scores are observed in most pretest posttest studies for at least some individuals.<br />

Figure 7 below is an example of a graph of difference scores. In this case it is displaying difference scores for the<br />

HSRT (Health Sciences Reasoning <strong>Test</strong>) OVERALL Score. The HSRT is a version of the CCTST designed for use with<br />

students and professionals in health care and the health sciences.<br />


[Histogram: frequency of HSRT OVERALL difference scores, pretest to posttest]<br />

Figure 7: Difference Scores Comparing Pretest with Posttest Scores<br />

Increases in OVERALL Scores of 2 or more points (darker blue) are evidence of effective training programs. Gains of<br />

4 or more points (brighter blue) are exceptional. These difference scores are the most informative data for analyzing<br />

an educational or training program’s impact on individual trainee critical thinking skills.<br />

Losses in scores of 3 or more points are not expected. Scores or percentiles that drop significantly at posttest are<br />

very rare, as critical thinking skills are not lost over a short period of time in the absence of cognitive injury, chemical<br />

impairment, or a failure to give or to be able to give best effort. Again, other reported data (less than 15 minutes<br />

time on test at posttest or a ratio of items completed that is less than .60 at posttest, both indicative of poor effort<br />

at posttest) may explain dropped scores at posttest.<br />


Section 4:<br />

Validity & Reliability<br />

This section provides important information relating to the validity and reliability of Insight Assessment’s instruments. Major topics include content, construct, criterion (predictive) validity, internal consistency, and test-retest reliability. Included are hyperlinks to published research reports about the validity and reliability of the instruments.<br />

Content, Construct, and Criterion<br />

(Predictive) Validity<br />

At Insight Assessment we take the measurement of reasoning skills and mindset very<br />

seriously. Our products measuring reasoning skills and mindset have been studied in a variety of populations and<br />

contexts over the past 25 years. In each case, items/scales are piloted in target samples and validated in replicated<br />

studies (undergraduate students, graduate students, employees and trainees, military officers and enlisted<br />

personnel, children at every K-12 grade level, health professionals across the spectrum of health care disciplines,<br />

law students, MBA students, technical and community college students, and the general population) to assure the<br />

performance of the assessments in the intended population. Likert style items that measure mindset are grouped in<br />

scales with demonstrated validity and reliability and are tested against social desirability bias and cultural bias.<br />

Multiple choice items that measure reasoning skills are the result of an item pool tested over a 40-year period to define item difficulty and scale membership. Built on a growing science of the measurement of human decision-making, each instrument has been psychometrically evaluated in collaboration with researchers, educators, trainers, and working professionals, to assure cultural and language competence in the intended test taker group. Validity and reliability coefficients meet the highest standards for all instruments.<br />

Measurement science provides clear evidence that higher-order cognitive skills, such as critical thinking, can be measured validly and reliably by well-crafted multiple choice items. Insight Assessment’s researcher-led instrument development program, which began in the 1970s, has demonstrated instrument performance. Our customers rely on this quality in hundreds of independent research studies carried out by researchers and educators throughout the world.<br />

The lead researchers and test developers gratefully acknowledge our many international colleagues who have<br />

worked to establish the validity and reliability of the translated instruments, our many health care, business, law,<br />

and military professionals who advised on the production of discipline tailored measures and the INSIGHT<br />

professional line, and the additional validation work in reading comprehension done by Dr. Joanne Carter Wells of<br />


California State University Fullerton, and the psychometric consultation and focus on numeracy contributed by Dr. Carol Gittens of Santa Clara University.<br />

The information in this section on validity and reliability applies to all the reasoning skills instruments offered by<br />

Insight Assessment, which currently includes the CCTST, CCTST-N, CCT-G835, BCTST, BCTST-N, HSRT, HSRT-N, TER,<br />

TER-N, BRT, CCTST M-Series, and the skills sections of two-part tests in the MDCTI, LSRP, TRAA, and INSIGHT series.<br />

And it applies as well to the related attribute measures focusing on reasoning dispositions and habits of mind, namely the CCTDI, BAI, CM3, and the first parts of the two-part tests in the MDCTI, LSRP, TRAA, and INSIGHT series. Because<br />

skills test questions and attribute measure prompts for all of these instruments are drawn from extensive item pools<br />

which have been developed and validated through decades of testing, ease of reading demands that we largely<br />

reference only the CCTST in the paragraphs below.<br />

Content Validity<br />

Content Validity refers to the ability of a test to capture a measure of the intended domain. Identification of the<br />

pertinent domain, and obtaining agreement on it, are of primary importance to content validation. 18 A second<br />

criterion of content validity is assuring that “sensible” methods of test construction are employed. 19 In the case of<br />

the CCTST, the specified domain is critical thinking as defined by the Delphi group and discussed in Sections 1 and 5.<br />

<strong>Critical</strong> thinking, as defined by the APA Delphi study, 20 is a construct which integrates a number of cognitive<br />

maneuvers known to be components of this type of human reasoning process. These maneuvers are included in<br />

the APA Delphi study report as embedded concepts. Analysis, inference, and evaluation are examples. Each version<br />

of the CCTST is designed as a holistic measure of the construct <strong>Critical</strong> <strong>Thinking</strong>, with embedded scales that can be<br />

used to examine the embedded concepts as well.<br />

The content validity of the CCTST is further supported by its adoption by educators in the field of human<br />

reasoning, by researchers and doctoral dissertation scholars studying human reasoning skills, and by human<br />

resources professionals seeking to hire employees with strong decision skills. Validity of measurement<br />

also requires that the testing instrument must be free of unintended distractors that influence the response choice<br />

of groups of test takers and be calibrated to the intended test taker group. <strong>Test</strong> administrators are cautioned to<br />

assure that the CCTST matches the educational and reading level of the planned test taker group.<br />

In all of the <strong>California</strong> family of critical thinking skills tests, test takers are challenged to form reasoned judgments<br />

based on a short scenario presented in the question stem. The CCTST does NOT test any content area knowledge.<br />

CCTST questions are framed in the context of everyday concerns. All necessary information needed to answer the<br />

question correctly is presented in the question stem. The fact that the CCTST measures only critical thinking and not<br />

content knowledge makes it possible to use this instrument as a pretest and posttest to measure improvement in<br />

critical thinking that occurs during any educational program or staff development exercise.<br />

For a valid measure of critical thinking, the instrument must present the appropriate range of difficulty for the<br />

individual or group being tested to allow the accurate scaling of the score. The CCTST family of critical thinking skills<br />

tests is designed to include an appropriate form of the CCTST to test strengths and weaknesses in critical thinking in a<br />

comprehensive range of individuals or groups. Contact Insight Assessment for information about selection of the<br />

most appropriate form of the CCTST.<br />

18<br />

Cronbach, Lee, Essentials of Psychological <strong>Test</strong>ing, Harper & Row 1990.<br />

19<br />

Nunnally, Jum C., Psychometric Theory, McGraw-Hill 1978<br />

20<br />

Facione, P.A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, ("The Delphi Report"). ERIC Doc. No. ED 315-423.<br />

Language Translations: When an instrument is translated into other languages, methods to maintain the validity of<br />

items and scales are an integral concern. Variations in culture have required that some items be changed in the<br />

non-English language translations due to idiomatic or cultural interpretation issues, so the various language versions<br />

are not completely identical at the item level. However, much care has been taken, through collaborations with<br />

international scholars who are native language speakers, and through rigorous iterative translation procedures, to<br />

assure that validity, reliability, and cultural competence are achieved in all authorized translations.<br />

Construct Validity<br />

Construct Validity is typically demonstrated by correlational studies where critical thinking scores are correlated with<br />

other measures that purport to include the construct. Forms of the CCTST have demonstrated strong correlations<br />

with other instruments that purport to include a measure of critical thinking or higher-order reasoning as a<br />

component of their scores or ratings. High correlations with standardized tests of college-level preparedness in<br />

higher-order reasoning have been demonstrated (GRE Total Score: Pearson r = .719, p

Comment - Age and <strong>Critical</strong> <strong>Thinking</strong>: Age is not a significant predictor of critical thinking ability in adults<br />

when educational level is controlled and when the sample is drawn from those involved in the workplace or in<br />

educational training programs. Not much is known about critical thinking skills in the general population. Children<br />

of all ages demonstrate varying ability in critical thinking. The measurement tool must be calibrated to age (grade<br />

level) for all but high performance samples in the K-12 population.<br />

Comment – Sex and <strong>Critical</strong> <strong>Thinking</strong>: There is no significant difference in scores between males and females<br />

in critical thinking skills tests distributed by Insight Assessment. This has been demonstrated in hundreds of<br />

thousands of test administrations in all types of population groups. When differences have been observed in small<br />

samples, the proportion of males and females in the sample is typically skewed due to some selection effect.<br />

Criterion (Predictive) Validity<br />

Criterion Validity is the most important consideration in the validity of a test. Criterion validity refers to the ability<br />

of the test to predict some criterion behavior external to the test itself. 22 For example, the validity of a cognitive test<br />

for job performance is the demonstrated relationship between test scores and supervisor performance ratings. In<br />

the case of the CCTST, one might want to know that it could predict some meaningful measure demonstrating the<br />

achievement of designated learning outcomes or the successful preparation and licensure of key professionals in<br />

society, or the successful transition to the workplace. Scores on the various versions of the CCTST have been<br />

demonstrated to provide this type of predictive value in peer-reviewed independent published research. The CCTST<br />

(and related critical thinking skills instruments) are cited in a large and growing literature, reflecting findings in both<br />

the United States and other nations around the world. International research often uses the CCTST (or another<br />

related critical thinking skills instrument) in authorized translations. The CCTST is a preferred measure of reasoning<br />

skills in recent US National Science Foundation (NSF) grant-funded studies of science education.<br />

Independent, peer reviewed research provides evidence of predictive validity of the CCTST (and other associated<br />

critical thinking skills measures). The following links will lead you to a website listing of published independent<br />

research documenting the criterion (predictive) validity of the CCTST, studies that use the CCTST to evaluate training<br />

techniques, to examine the achievement of learning outcomes, and to study leadership decision-making. Included<br />

in this research are doctoral dissertation studies examining critical thinking in relationship to disciplinary<br />

22<br />

Nunnally, JC and Bernstein, IH. Psychometric Theory (3rd edition), pp. 94-101. McGraw Hill, New York, 1994.<br />

training in a wide variety of disciplines. For the convenience of readers who may be reading this User Manual in<br />

hard copy, some of these published studies are referenced here. 23,24,25,26,27,28<br />

Internal Consistency Reliability (KR-20, Cronbach’s Alpha)<br />

Because the CCTST, BCTST, TER, HSRT, Quant-Q, BRT, and the second parts of the LSRP, MDCTI, and TRAA are<br />

measures of cognitive skills, the discussion of the Kuder-Richardson statistic applies.<br />

The appropriate internal consistency reliability coefficient for the reasoning skills instruments is the Kuder-<br />

Richardson formula (KR-20) because scoring for these instruments is dichotomous. However, this coefficient is known to<br />

underestimate the actual reliability of the instrument when there are fewer than 50 items and when the construct<br />

being measured is not highly homogeneous.<br />

KR-20 values of .70 or higher are deemed evidence of strong internal consistency in non-homogeneous measures. This level of<br />

internal consistency is the standard used for development of Insight Assessment critical thinking skills instruments.<br />

The OVERALL Scores of all distributed versions of the reasoning skills tests meet or exceed this .70 criterion in the<br />

validation samples, and in large model population samples. KR statistics in this range are typically observed in<br />

independent samples when the sample size and variance are adequate. Factor loadings for items range from .300 to<br />

.770.<br />
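
To illustrate how this statistic is computed (a sketch for illustration only, using invented 0/1 response data; this is not Insight Assessment’s scoring code), KR-20 can be calculated as follows:<br />

```python
# Illustrative sketch only: the KR-20 internal consistency coefficient for a
# dichotomously scored (0/1) item matrix. The response data are invented.

def kr20(items):
    """items: one row per test taker, each row a list of 0/1 item scores."""
    n = len(items)                       # number of test takers
    k = len(items[0])                    # number of items
    totals = [sum(row) for row in items]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    # Sum of item variances p*(1-p), where p = proportion answering correctly.
    pq_sum = 0.0
    for j in range(k):
        p = sum(row[j] for row in items) / n
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / var_total)

scores = [
    [1, 1, 1, 0, 1],
    [1, 0, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 0, 1, 1],
]
print(round(kr20(scores), 3))  # this small invented sample yields 0.551
```

Note that the coefficient compares the summed item variances against the total-score variance, so samples with restricted variance depress the value; this is one reason sample adequacy matters when interpreting the statistic.<br />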

As we indicated in Section 4, conceptually, the traditional delineation of reasoning into deductive or inductive cross-<br />

cuts the APA Delphi Report’s list of core critical thinking skills. This means that any given inference or analysis or<br />

interpretation, for example, might be classified as deductive or as inductive, depending upon how the theoretician<br />

conceives of these more traditional and somewhat contested categories. Conceptually, the skills in the Delphi list are<br />

not necessarily discrete cognitive functions either, but in actual practice are used in combination during the process<br />

of forming a reasoned judgment, that is, critical thinking. In some contexts, though, a given skill can be considered<br />

foremost, even though other skills are also being used. For example, a given test question may call heavily upon a<br />

test taker’s numeracy skills, while at the same time requiring the correct application of the person’s analytical and<br />

23<br />

Williams KB, Glasnapp DR, Tilliss TS, Osborn J, Wilkins K, Mitchell S, Kershbaum W, Schmidt C. (2003). Predictive validity of<br />

critical thinking skills for initial clinical dental hygiene performance. Journal of Dental Education, 67(11):1180-92.<br />

24<br />

Sorensen HA, Yankech LR. (2008). Precepting in the fast lane: improving critical thinking in new graduate<br />

nurses. Journal of Continuing Education in Nursing.<br />

25<br />

Denial, A. (2008). Association of critical thinking skills with clinical performance in fourth year optometry students. Journal of<br />

Optometry Education, 33(3), 103-6.<br />

26<br />

McCall KL, MacLaughlin EJ, Fike DS, Ruiz B. (2007). Preadmission predictors of PharmD graduates' performance on the NAPLEX.<br />

American Journal of Pharmacy Education, 15; 71(1):5.<br />

27<br />

Vendrely, A. (2007). An investigation of the relationships among academic performance, clinical performance, critical thinking, and<br />

success on the physical therapy licensure examination. J Allied Health, 36(2), e108-123. See also Suckow, D.W., et al. (2015). The<br />

association between critical thinking and scholastic aptitude on first-time pass rate of the national physical therapy examination.<br />

University of Dayton eCommons, Physical Therapy Faculty Publications, Paper 27.<br />

28<br />

O’Hare L & McGuinness C. (2015). The validity of critical thinking tests for predicting degree performance: A longitudinal study.<br />

International Journal of Educational Research.<br />

interpretive skills. For these reasons, and others relating to test design and cognitive endurance, the questions on<br />

the CCTST, in its various versions, may or may not be used on more than one scale. As a result, although the specific<br />

skill scores reported have internal consistency reliability, test-retest reliability, and strong value as indicators of<br />

specific strengths and weaknesses, they are not independent factors, which is theoretically appropriate to the<br />

holistic conceptualization of critical thinking as the process of reasoned and reflective judgment, rather than simply<br />

a list of discrete skills.<br />

Because the CCTDI, CM3, BAI, and the first parts of the LSRP, MDCTI, and TRAA are measures of mindset attributes,<br />

the Cronbach’s Alpha statistic applies.<br />

Cronbach’s Alpha is the appropriate internal consistency coefficient for all measures of critical thinking and<br />

leadership mindset dispositional elements because scoring for these instruments is in Likert format. Assessment<br />

instruments sold by Insight Assessment meet the threshold for strong internal consistency reliability (a minimum<br />

Alpha of 0.80) and are observed to maintain this performance in all samples of adequate variance.<br />

Internal consistency reliability for the seven individual scales in the initial CCTDI pilot sample ranged from .71 to .80,<br />

with the alpha for the overall instrument reaching or exceeding .91. Strong values have been observed consistently<br />

in samples collected over the past 15 years (ranging from .60 to .78 on the scales and .90 or above for the overall<br />

measure). Lower reliability coefficients are observed in samples where the variance of scores is not large.<br />
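
For readers who want to see the computation, Cronbach’s Alpha can be sketched as follows. This is an illustration only, with invented Likert-style ratings; it is not the scoring procedure for the CCTDI or any other instrument:<br />

```python
# Illustrative sketch only: Cronbach's Alpha for Likert-style item ratings.
# The ratings below are invented.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one row per respondent, each row a list of numeric ratings."""
    k = len(items[0])                                   # number of items
    totals = [sum(row) for row in items]                # total per respondent
    item_vars = sum(variance([row[j] for row in items]) for j in range(k))
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

ratings = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
]
print(round(cronbach_alpha(ratings), 3))  # this invented sample yields 0.915
```

Because the formula contrasts summed item variances with total-score variance, a sample whose scores vary little will show a lower coefficient, which is consistent with the observation above about restricted-variance samples.<br />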

Occasionally a customer may require a calculation of the internal consistency reliability coefficient for their own<br />

sample. This is sometimes the case in research studies where the population being measured is atypical in some<br />

respect. When the sample size is adequate to support the analysis (at least several hundred participants is advised),<br />

clients may request a custom analysis of the appropriate internal consistency statistic for their study sample.<br />

Additional fees apply.<br />

<strong>Test</strong>-Retest Reliability<br />

<strong>Test</strong> retest reliability for all instruments distributed by Insight Assessment meets or exceeds .80 in samples with<br />

adequate variance, retested at two weeks post pretest. Many samples demonstrate no change after far longer<br />

intervals when no active training in critical thinking has occurred between pretest and posttest. This is true for both<br />

measures of reasoning skills and measures of mindset. No statistical evidence of an instrument effect has been observed for any<br />

instrument in internal studies of test retest reliability.<br />

We have observed that measures of critical thinking skills and mindset are very stable over time when there is no<br />

history of training in critical thinking. <strong>Test</strong> retest coefficients for both mindset and skills instruments are typically<br />

observed to meet or exceed .80 when the Time-2 administration is given two weeks after the Time-1 administration,<br />

and even after intervals as long as three years post-pretest where there is no on-going educational program.<br />
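
Coefficients like the .80 figure above are ordinarily obtained as the Pearson correlation between the Time-1 and Time-2 scores. A minimal sketch with invented scores follows (not Insight Assessment’s analysis code):<br />

```python
# Illustrative sketch only: test-retest reliability as the Pearson correlation
# between Time-1 and Time-2 total scores. The scores are invented.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

time1 = [72, 65, 80, 58, 90, 75, 68]   # pretest total scores
time2 = [74, 63, 82, 60, 88, 77, 70]   # retest scores two weeks later
print(round(pearson_r(time1, time2), 3))  # this invented sample yields 0.983
```

A coefficient near 1.0 indicates that test takers retain their relative standing from pretest to retest, which is the stability the manual describes.<br />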

Published Evidence of Validity and Reliability - Research Reports<br />

Previously we have listed studies that might be helpful to those designing and carrying out studies that assess critical<br />

thinking skills. This literature is now quite vast. If you are reading this User Manual as an electronic file, the following<br />

links will lead you to our website listing of published independent research documenting the criterion (predictive)<br />

validity of the CCTST, studies that use the CCTST to evaluate training techniques, to examine the achievement of<br />

learning outcomes, and to study leadership decision-making. If you are reading in hard copy, visit our website<br />

through your browser or mobile device and select the “Resources” tab. This area of the site provides you with many<br />

teaching and learning materials for classroom and training program development and also abstracts of peer<br />

reviewed publications describing studies that evaluate training techniques, studies reporting on learning outcomes<br />

assessment, studies linking critical thinking scores to performance ratings, and studies documenting the value of<br />

critical thinking scores for admission, student retention, and predicting successful licensure in professional<br />

education.<br />

Effective instructional interventions should be expected to have a positive impact on critical thinking. Philip Abrami<br />

and several colleagues conducted a meta-analysis, published in the Review of Educational Research, which examined<br />

117 studies involving 20,298 participants. 29 They report an average positive effect size (g+) of 0.341 and a standard<br />

deviation of 0.610, with fluctuations in critical thinking effect sizes related to the type of instructional intervention<br />

and pedagogy applied. Taken together "these findings make it clear that improvement in students' critical thinking<br />

skills and dispositions cannot be a matter of implicit expectation...educators must take steps to make critical thinking<br />

objectives explicit in courses and also include them in both pre-service and in-service training and faculty<br />

development." The conceptualization of critical thinking used in the Abrami et al. research is the APA Delphi<br />

construct, the same construct used in the development of Insight Assessment tests and measures, namely that<br />

critical thinking is the process of purposeful self-regulatory judgment focused on deciding what to believe or what<br />

to do.<br />
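
The effect-size metric reported by Abrami and colleagues (g+ is an average of Hedges’ g values) combines a group mean difference with a pooled standard deviation and a small-sample correction. The sketch below illustrates the computation with invented posttest scores; it is not drawn from the meta-analysis itself:<br />

```python
# Illustrative sketch only: Hedges' g, the effect-size metric averaged (as g+)
# in meta-analyses such as Abrami et al. The group scores are invented.

def hedges_g(treatment, control):
    def mean(xs):
        return sum(xs) / len(xs)
    def sum_sq(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs)
    n1, n2 = len(treatment), len(control)
    df = n1 + n2 - 2
    pooled_sd = ((sum_sq(treatment) + sum_sq(control)) / df) ** 0.5
    d = (mean(treatment) - mean(control)) / pooled_sd   # Cohen's d
    correction = 1 - 3 / (4 * df - 1)                   # small-sample correction
    return correction * d

posttest_treatment = [21, 24, 19, 26, 23, 22, 25, 20]
posttest_control = [19, 22, 18, 21, 20, 17, 23, 19]
print(round(hedges_g(posttest_treatment, posttest_control), 3))  # yields 1.103
```

Against the meta-analytic average of 0.341 cited above, this invented example would represent an unusually large training effect.<br />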

Gains in critical thinking skills and mindset have been reported as the result of effective training programs after as<br />

little as a few weeks, but more frequently the training program has been several months or longer in duration (a<br />

college course or an employee training program, for example). With focused and effective training techniques,<br />

these gains can be substantial, particularly in individuals who have not previously reflected on their<br />

reasoning process. Many longitudinal studies can be found in the peer reviewed literature documenting gains in<br />

critical thinking skills or mindset as the result of curriculum change or training programs designed for employee<br />

development. These studies have been conducted in many countries. A classic study using the CCTST was the first<br />

to demonstrate this capture of critical thinking skills gains under a wide variety of circumstances. 30 Multiple<br />

professional degree granting programs have demonstrated significant gains in critical thinking skills using site-<br />

29<br />

Instructional Interventions Affecting <strong>Critical</strong> <strong>Thinking</strong> <strong>Skills</strong> and Dispositions: A Stage 1 Meta-Analysis. Philip C Abrami; Robert M<br />

Bernard; Evgueni Borokhovski; Anne Wade; et al, Review of Educational Research; Dec 2008; 78, 4; Research Library. pg. 1102<br />

30<br />

In the original CCTST validation study, analyses were conducted to investigate whether or not undergraduate students completing<br />

a required semester-long college course in critical thinking would show gains in critical thinking skills as compared to students who<br />

had not completed such a course. <strong>Critical</strong> thinking courses in this study had been approved as such by a university committee<br />

overseeing the critical thinking educational requirement. This research, which employed a treatment and control group design, used<br />

both the cross-sectional and the matched-pairs pretest-posttest measures. Significant gains were seen in both the cross-sectional<br />

(t=2.44, one-tailed p < .008) and matched-pairs analysis (t = 6.60, df = 231, p

specific curriculum in the professional discipline. 31 One longitudinal study documented significant gains at posttest<br />

(after a two- to three-month training program) and the retention of these gains one year later. 32 Many other recent<br />

and on-going studies are listed in our Resources section.<br />

In summary, testing instruments sold by Insight Assessment have met the threshold for strong internal<br />

consistency reliability (a minimum Alpha of 0.80 for attribute measures and a minimum KR-20 of .72 for skills<br />

measures) for their OVERALL Scores, and are observed to maintain this performance in all samples of adequate<br />

variance. These standards apply to published versions of the instrument in authorized translations developed in<br />

validation studies conducted in collaboration with our international scholar colleagues.<br />

31<br />

Facione, N.C. and Facione, P.A. <strong>Critical</strong> <strong>Thinking</strong> Assessment in Nursing Education Programs: An Aggregate Data Analysis. A<br />

research report. Millbrae, CA: <strong>California</strong> Academic Press, 1997.<br />

32<br />

This well-designed assessment project is monitoring the effectiveness of a personnel training program designed for US Air Force<br />

personnel.<br />

Section 5: Resources and Training Strategies<br />

This section provides some helpful information for teachers and trainers, and also<br />

conceptual information for those involved in developing learning outcomes<br />

assessment projects. If you are not reading this as a digital file, go to<br />

www.insightassessment.com/Resources to find all of these resources posted. We<br />

invite you to incorporate these links into program posts for educational purposes<br />

to help your trainees obtain the most up to date versions of these materials.<br />

Talking About <strong>Critical</strong> <strong>Thinking</strong><br />

The following collection of links will take you directly to a document that can be used in training programs<br />

and courses designed to help students and trainees become more reflective about their thinking and<br />

decision making. The documents, essays and discussions listed here are updated and new documents are<br />

added periodically. Thank you to these and many other authors for permission to reprint these materials<br />

for educational purposes. Thanks to Pearson Education for permission to reprint “Snap Judgments.” If you<br />

are not reading this as a digital file, go to www.insightassessment.com/Resources and click on the links under<br />

the heading “Talking About <strong>Critical</strong> <strong>Thinking</strong>.”<br />

<strong>Critical</strong> <strong>Thinking</strong>: What It Is and Why It Counts This essay was written and<br />

regularly updated by Dr. Pete Facione. It has been included in many training programs. At the time of publication for<br />

this 2016 manual the essay can be downloaded in English, Spanish, and Simplified Chinese.<br />

Expert Consensus on <strong>Critical</strong> <strong>Thinking</strong> Information about the APA Delphi Study that<br />

resulted in the multidisciplinary consensus definition of critical thinking, and related documents, can be found on<br />

this link.<br />

Characteristics of Strong <strong>Critical</strong> Thinkers This post lists the characteristics of strong<br />

critical thinkers identified in the APA Delphi Study. These personal characteristics have since been endorsed by<br />

educators, business professionals and civic leaders around the world.<br />

Ten Positive Examples of <strong>Critical</strong> <strong>Thinking</strong> At times it is helpful to think about common<br />

examples of when critical thinking is most important. This list of ten can help trainees develop a list relevant to their<br />

own life and work.<br />

Effective Techniques for Building Reasoning <strong>Skills</strong> Training reasoning and<br />

decision making skills and helping trainees and students to develop a thinking mindset requires that the trainer use<br />

effective methods. This list of strategies provides trainers with ideas to help them reframe current teaching and<br />

training methods to make them more effective for training critical thinking.<br />

Why Measure Quantitative Reasoning (Numeracy)? Numeracy, the ability to<br />

reason in the context of numbers and proportional relationships, is very important in an increasing number of<br />

professions. A recent interest in assessing this dimension of critical thinking is discussed here.<br />

How are educators teaching critical thinking today? This discussion highlights<br />

the increasingly pervasive expectation that training processes and group meetings should be more analytical and<br />

reflective when group problem solving is occurring.<br />

Can critical thinking be assessed with rubrics? How can a rubric best be used to<br />

assess critical thinking? This post discusses optimal measurement with rubrics and cautions against misuse.<br />

Snap Judgments PDF Do your trainees understand how humans really think? Can they see the value<br />

of training reasoning skills in the context of real life, high stakes decision making? This post is a chapter that presents<br />

an exciting discussion about heuristic reasoning from the newest version of the text Think <strong>Critical</strong>ly 2016 by Facione<br />

and Gittens.<br />

Perspectives that Influence <strong>Thinking</strong> and Knowing This tool describes seven<br />

different ways that individuals see the world. These varying perspectives have a profound effect on how a person<br />

interprets new information, identifies problems (or fails to), and determines how or whether the problems can be<br />

solved.<br />

Talking <strong>Critical</strong> <strong>Thinking</strong> PDF In this allegorical essay, which appeared in Change: The<br />

Magazine of Higher Learning, we walk side by side with an academic dean who is preparing to explain to the Board<br />

of Trustees the importance of critical thinking.<br />

Tips on the Strategy of Interest-Based Negotiation This discussion locates critical<br />

thinking in the context of election politics.<br />

Facione: Essays On Liberal Learning and Institutional Budgeting<br />

Revamping courses and entire programs is an expensive project. Increasingly, clients are formulating grant proposals<br />

for federal assistance to cover the cost of training evaluation projects. Finding the resources is difficult. It is in this<br />

spirit that we include these posts.<br />

Terminology for Discussing <strong>Critical</strong> <strong>Thinking</strong> The consensus definition of <strong>Critical</strong><br />

<strong>Thinking</strong> discussed in Section 1 and derived from the APA Delphi study provides an easily accessible terminology for<br />

discussing human thinking processes and habits of mind and for communicating the importance of critical thinking<br />

in training programs. This accessible terminology is included in Table 4 of the report and appears here with reprint<br />

permission. The table can be reproduced for use in educational programs.<br />

Teaching and Training Tools<br />

These live links will connect you directly to posts about measuring critical thinking skills, mindset, leadership, and a<br />

number of other related constructs. Thank you to Dr. Carol Gittens (Santa Clara University) for permission to reprint<br />

The REWA and the Reflective Log. Thanks to the authors of “<strong>Critical</strong> <strong>Thinking</strong> and Clinical Reasoning in the<br />

Health Sciences” for their insight on best training practices. Thanks to Drs. Peter and Noreen Facione (Measured<br />

Reasons) for permission to reprint materials used in their training workshops. 33 Thank you to the USAF for the Performance<br />

Assessment Rubric. If you are not reading this as a digital file, go to www.insightassessment.com/Resources and<br />

click on the links under the heading “Teaching and Training Tools.”<br />

Sample <strong>Thinking</strong> <strong>Skills</strong> Questions The sample skills test questions on this page are intended<br />

to illustrate the types of questions which might appear on a generic adult level reasoning skills test.<br />

Sample Items for Measuring <strong>Thinking</strong> Attributes The sample “agree-disagree”<br />

style items on this page illustrate the types of statements that could appear on a college or adult level measure of<br />

critical thinking mindset.<br />

<strong>Critical</strong> <strong>Thinking</strong> Insight App If you are looking for a critical thinking self-test, several are<br />

available through the Insight Assessment App.<br />

Holistic <strong>Critical</strong> <strong>Thinking</strong> Scoring Rubric (HCTSR) The HCTSR is a rating<br />

measure that can be used to assess the observable critical thinking demonstrated by presentations, reports, essays,<br />

projects, classroom discussions, panel presentations, portfolios, and other ratable events or performances. The<br />

HCTSR is available for download in several languages.<br />

Professional Judgment Rating Form (PJRF) The Professional Judgment Rating Form<br />

(PJRF) was developed by our research team to make holistic assessments of critical thinking in educational and<br />

workplace settings.<br />

USAF Performance Assessment Rubric This three point rubric rates the process of<br />

problem identification and analysis as “EXCELLENT: well defined problem,” “SATISFACTORY: adequately defined<br />

problem,” and “DEFICIENT: wrong problem.”<br />

Evaluating Written Argumentation (REWA) This rubric is designed to provide<br />

detailed feedback on written material intended to argue persuasively on behalf of a given claim, opinion, or<br />

recommendation.<br />

33<br />

In addition to downloads from our website, many of the essays and teaching tools listed here can also be downloaded from the<br />

authors’ academia.edu postings or from the Measured Reasons website.<br />

Techniques for Trainers of Reasoning <strong>Skills</strong> and Decision Making<br />

PDF This document is a concise list of valuable training strategies. Use these techniques to strengthen the training<br />

strategies you currently use to improve thinking skills and mindset in your trainee and student groups.<br />

Reflective Log This critical thinking tool is intended to give structure and focus to journaling by students<br />

or trainees to integrate their insights about their thinking and decision making.<br />

Participant Course Evaluation Form This five-factor tool can be used either for<br />

formative evaluation or to assist with mid-course corrections.<br />

Course Evaluation Design Discussion Questions The assessment research team at<br />

Insight Assessment offers this set of guiding questions to faculty and academic leaders seeking an effective and<br />

integrated approach to student course evaluations.<br />

<strong>Critical</strong> <strong>Thinking</strong> Exam Questions and Study Guides for High<br />

Content Courses See how “Why Correct?” and “Why Wrong?” formats convert standard content-based<br />

multiple choice items into explanations.<br />

<strong>Critical</strong> <strong>Thinking</strong> Requirement Evaluation Guidelines A set of guidelines for<br />

evaluating the position of critical thinking as an educational requirement in an institution’s general education or<br />

degree program learning outcomes.<br />

Question Asking <strong>Skills</strong>: A Leadership Training Tool Question asking is key in<br />

unfamiliar and uncertain problem situations. Building questioning skills is an important part of training thinking skills.<br />

Training Session Feedback Form This tool is intended to function as both a self-evaluation<br />

tool for the trainee and as an evaluation of the training program itself for its ability to engage the learner as intended.<br />

Completing the feedback form guides trainees to reflect specifically on their thinking experience related to the<br />

learning opportunity.<br />

Strong <strong>Critical</strong> <strong>Thinking</strong> in Groups This one-page tool guides evaluation of the quality of<br />

the thinking and decision making demonstrated by the group process.<br />

The Culture of <strong>Thinking</strong> in your Organization Use this tool to assess the culture of<br />

thinking and decision-making characteristic of your organization.<br />

Designing A Study of Workplace Productivity Use this tool to infuse strong reasoning<br />

and decision making into studies of workplace conditions or as an example of how strong thinking and decision skills<br />

are embedded in each step of a well-designed investigation.<br />

Training <strong>Critical</strong> <strong>Thinking</strong> and Clinical Reasoning The following best practices<br />

essays are excerpted from “<strong>Critical</strong> <strong>Thinking</strong> and Clinical Reasoning in the Health Sciences.” Each essay provides an<br />

example of training reasoning skills and thinking mindset described by international experts in training clinical<br />

reasoning.<br />

Research Findings<br />

Research reports linking key variables to critical thinking are increasing in many disciplines. These peer reviewed<br />

publications are written by researchers in a broad range of disciplines located at institutions around the world.<br />

Each entry provides the name of the paper, the author(s), the journal/year of publication, and a brief abstract of<br />

the publication. Mini titles make it easy to determine if the paper is relevant to your current work. These abstracts<br />

are particularly useful to those preparing dissertation studies and proposals for grant funding and federal support.<br />

At the time of this publication more than 80 research abstracts are identified for these five topics to facilitate the<br />

sharing of these findings. We update the list continuously and encourage researchers to send us notification of their<br />

peer reviewed publications. While many of these studies were conducted using instruments in language translations<br />

other than English, currently our lists are limited to English language publications. If you are not reading this as a<br />

digital file, go to www.insightassessment.com/Resources and click on the links under the heading “Research<br />

Findings.”<br />

Evaluating Training Techniques This link connects you to a collection of studies describing<br />

and evaluating a variety of training techniques for evidence that they effectively train critical thinking skills or<br />

mindset.<br />

Learning Outcomes Assessment This link connects you to a collection of studies reporting the<br />

outcome of assessment projects. Papers from general education projects, STEM education studies, health sciences<br />

training projects, and business education curriculum evaluation projects, are included.<br />

Admissions, Retention, and Licensure These peer reviewed reports link critical thinking<br />

scores to professional licensure exam performance and other indicators of student success.<br />

Performance Ratings Increasingly, strength in critical thinking factors into workplace assessment.<br />

These studies demonstrate that critical thinking scores are predictive of employer and preceptor ratings.<br />

Leadership, <strong>Skills</strong> and Mindset What are the characteristics desired in leaders and decision<br />

makers? This collection of papers examines some potential relationships.<br />

Quotes about <strong>Thinking</strong> Courageously and Well<br />

“The unexamined thought is not worth thinking.” Pat Croskerry, M.D.<br />

“A mind stretched by a new idea never goes back to its original dimensions.”<br />

Oliver Wendell Holmes<br />

“Fix reason in her seat, and call to her tribunal every fact, every opinion. Question<br />

with boldness…” Thomas Jefferson<br />

“Nothing in all the world is more dangerous than sincere ignorance and<br />

conscientious stupidity.” Martin Luther King, Jr.<br />

“The illiterates of the 21st century will not be those who cannot read and write, but<br />

those who cannot learn, unlearn, and relearn.” Alvin Toffler<br />

“People can be extremely intelligent, have taken a critical thinking course, and<br />

know logic inside and out. Yet they may just become clever debaters, not critical<br />

thinkers, because they are unwilling to look at their own biases.” Carol Wade<br />

“<strong>Critical</strong> thinking is skeptical without being cynical. It is open-minded without<br />

being wishy-washy. It is analytical without being nitpicky. <strong>Critical</strong> thinking<br />

can be decisive without being stubborn, evaluative without being judgmental and<br />

forceful without being opinionated.” Peter Facione<br />

“The important thing is never to stop questioning.” Albert Einstein<br />

“The first thing that parents can do is, in accordance with the child’s age,<br />

temperament, and capacity, explain, explain, explain. Give reasons for decisions<br />

and punishments.” Carol Tavris<br />

“I am convinced that what we believe has to be able to stand the test of evaluation.<br />

For example, the idea that teaching should be value free doesn’t make sense.” John<br />

Chaffee<br />

Section 6:<br />

Customer Relationship<br />

This section provides important legal messages and notifications<br />

pertaining to Insight Assessment test instrument use<br />

licenses, including the fundamental agreement for the use of<br />

testing licenses, non-disclosure and non-compete agreement,<br />

buyer qualification, privacy, data security, instrument protection,<br />

disability accommodation, and copyrights.<br />

Use Licensure Agreement:<br />

Insight Assessment provides assessment solutions which are single, time-limited instrument use<br />

licenses and the related scoring and score reporting services.<br />

<strong>Test</strong> use licenses, whether for electronic testing or paper and pencil administration, expire 1 year (12 months) after<br />

the date of purchase as shown on the invoice. One license to use a test by any means provided, whether by paper<br />

or via electronic gateway, (e.g. the Insight Assessment website interface, client LMS, or Insight Assessment personal<br />

device app) permits one individual to be tested one time with one test. An electronic test license is considered to<br />

have been used when the test questions are downloaded, even if the test taker does not elect to submit responses<br />

for scoring. A license for paper-and-pencil administration includes the one-time use of a test booklet and a<br />

CapScore answer form, and is considered to have been used when an answer form is marked and returned for<br />

scoring.<br />

Booklets and answer forms and all related delivery solutions are proprietary business properties of Insight<br />

Assessment. All clients and their test takers agree to ensure the security of testing instruments, whether administered<br />

through paper-and-pencil or online. Client and client's test takers agree not to reproduce, copy, replicate, image,<br />

or publish in any way any testing instrument in whole or in part, to prohibit others from doing so, and to protect and<br />

defend copyright as indicated on Insight Assessment testing materials and online interface. Once used, paper testing<br />

booklets should be destroyed or returned to Insight Assessment, and answer forms should be sent to Insight<br />

Assessment for scoring.<br />

Use licenses apply only to Insight Assessment testing materials and their Insight Assessment authorized translations.<br />

Insight Assessment reserves the right, at its sole discretion and without the obligation of prior notice or explanation,<br />

not to honor requests to purchase licenses to use its instruments and the right to refuse requests to process scoring<br />

or to report scores, regardless of the source of such requests. Customer purchase order or valid credit card must be<br />

on file for non-prepaid orders. Customer acknowledges, agrees, and authorizes Insight Assessment to charge<br />

customer's on-file credit card without additional notification for the full balance due on customer's past-due invoices.<br />

All sales are final.<br />

Scoring is done by Insight Assessment only. The only agency authorized to score and to report scores on<br />

Insight Assessment instruments is Insight Assessment. Item level information and response keys are proprietary.<br />

Assessments must be administered in accord with the instructions provided in the instrument’s User Manual. Insight<br />

Assessment reserves the right not to score data derived from assessment uses not properly administered. Client<br />

understands that scale structures and item responses are not reported. Do not duplicate paper answer forms;<br />

duplicated answer forms will not be scored. Although the scoring and results are provided by Insight Assessment, all<br />

clients and their test takers acknowledge that the interpretation and use of those scores and results is solely the<br />

responsibility of the client.<br />

Ownership and Use of Data Collected and Stored in the Online <strong>Test</strong>ing System or Scored<br />

through the CapScore System: Data stored in the online system or CapScore system is owned by Insight<br />

Assessment / the <strong>California</strong> Academic Press. Online testing system data are released only to the client who<br />

contracted to collect or who collected these data, not to any other individual or agency except as may be required<br />

by law. It is the client's right to download copies of these data for their use. Scored CapScore data will be<br />

transmitted only to the client or the agent designated by the client on the written CapScore Return Form in the<br />

format requested (hard copy or electronic files – additional fees may apply to requests for hard copy reports). The<br />

responsibility for privacy protection of data downloaded by clients from the Insight Assessment /the <strong>California</strong><br />

Academic Press testing system or returned as digital files from the CapScore system rests entirely with the client.<br />

Additional requests by the client for analyses of stored data and the retransmission of these data or these analyses<br />

to clients must be prearranged by written request from an authorized individual client or client agency.<br />

Aggregate Information: Although Insight Assessment / The <strong>California</strong> Academic Press publishes aggregate<br />

comparison percentiles for its products, it does not disclose personal information associated with any dataset<br />

involved in this process. CapScore clients are permitted access only to data collected at their own agency by<br />

themselves. Data collected by two separate clients at the same agency must be shared by the agency itself. Insight<br />

Assessment / the <strong>California</strong> Academic Press does not undertake consolidation of data even when requested by one<br />

of the agency clients unless the request comes from all clients at the agency. This service would constitute a separate<br />

contract and require permissions from all involved clients.<br />

Non-Disclosure & Non-Compete Agreement: By accessing the Insight Assessment online testing interface<br />

or purchasing a preview pack or instrument use licenses, all clients acknowledge that the online interface and the<br />

testing instrument(s) it contains or displays include proprietary business information, such as but not limited to the<br />

structure of test questions or the presentation of those questions and other information displayed in conjunction<br />

with the use of this testing interface. In the absence of a specific written agreement between the client and Insight<br />

Assessment, the client agrees that by purchasing a preview pack or testing licenses, the client and their organization<br />

shall not disclose, copy, or replicate this testing interface or this testing instrument(s), in whole or in part, in any<br />

comparable or competitive product or interface of any kind. In the absence of a specific written agreement between<br />

the client and Insight Assessment, the client agrees that by accessing the testing instrument(s) for any purpose,<br />

including but not limited to previewing the instrument(s), the client and the client’s organization shall not create,<br />

design, develop, publish, market, or distribute any comparable or competitive testing instrument(s).<br />

Privacy Policy<br />

Insight Assessment, a division of the <strong>California</strong> Academic Press, endorses the standards of the American<br />

Psychological Association in the use of individual difference data gathered during the testing of human subjects.<br />

Toward this end, and to maximize the ethical use of data collected as a function of the use of our products, we<br />

require purchasers of our products to provide evidence of their qualification as a test administrator. Qualification<br />

includes but is not limited to persons who hold jobs requiring testing activity: human resource professionals, staff<br />

development professionals, teachers, university faculty, researchers, counselors, clinical assessment staff,<br />

institutional research staff, etc.<br />

Graduate students who wish to purchase products from Insight Assessment / the <strong>California</strong> Academic Press must<br />

provide information regarding the testing credentials of their advisors as well as the assurance from advisors that<br />

the testing and data management will be in accordance with the ethical treatment of human subjects and protection<br />

of data. If more information is needed about purchaser qualifications for our products, please contact Insight<br />

Assessment / the <strong>California</strong> Academic Press, 1735 N 1st Street, Suite 306, San Jose, CA 95112-4529, USA.<br />

[email protected] or use our ‘Contact Us’ window on the website. Insight Assessment reserves the right<br />

to discontinue services or decline purchase requests in the event that there is reason to be concerned that doing<br />

otherwise would compromise the security of a measurement tool.<br />

The protection of personal privacy related to the action of collecting data or using individual or summary data by<br />

clients is the responsibility of the client, whether they are individuals or agencies. Insight Assessment / The <strong>California</strong><br />

Academic Press does not assume responsibility for the maintenance of personal privacy related to the actions of<br />

clients in the collection, storage, or use of data derived from the use of our products.<br />

Our privacy policy prevents publication of the actual names of institutions whose data are included in each<br />

comparison percentile project. Within the aggregated dataset, samples are from liberal arts colleges, land grant<br />

institutions, technology institutes, community and technical colleges, schools of the arts, research intensive<br />

universities, professional preparation programs, and other educational institution types. The samples are drawn<br />

from small and large colleges and universities, both public and private. In the case of the profession-specialized<br />

versions, such as the HSRT or the BCTST, norm samples also strive to represent the various disciplinary areas as fully as<br />

possible.<br />

Access to Client List Information: Insight Assessment / the <strong>California</strong> Academic Press considers its client list<br />

to be proprietary and private and does not release lists of clients to any other individuals or agencies except as may<br />

be required by law. It does not provide lists of clients to buyers or to prospective or current clients. Disclosure of<br />

stored client account information to the client themselves is permitted for the purposes of assuring accurate record<br />

keeping.<br />

Data Security<br />

In keeping with the concerns about the impact of the<br />

expansion of electronic networks on information privacy,<br />

and congruent with the ethical considerations of research on<br />

human subjects, Insight Assessment / The <strong>California</strong><br />

Academic Press maintains strict adherence to the protection<br />

of all personal data analyzed and stored as a function of<br />

purchasing our products. This document outlines the policies<br />

and procedures for the protection of personal data related<br />

to the purchase of products by our clients, the integrity of<br />

the online testing system data, and the transmission of<br />

CapScore test taker data to clients, whether these clients<br />

are individuals or agencies.<br />

Assessment construction and validation is conducted under the highest psychometric standards of each type of<br />

instrument. General psychometric information is reported in each assessment manual. Limited psychometric<br />

information is available to clients for research support purposes. Translation colleagues are provided with<br />

psychometric analyses of their validation sample datasets.<br />

Notice: For assessment security reasons, information on assessment item construction, instrument structure and<br />

scale structure is not released as a component of testing service purchase.<br />

Protection of <strong>Test</strong>ing Data Collected through any Insight Assessment / the <strong>California</strong> Academic<br />

Press Online or Electronic <strong>Test</strong>ing Systems or Data Scored and Stored Through the CapScore<br />

System: Access to a client's electronic data bank is protected with a client login and password system. Access to<br />

the client database is therefore limited to persons at the client organization who have knowledge of the login and<br />

password information for the client account, and to technical staff at Insight Assessment / the <strong>California</strong> Academic<br />

Press, who provide technical support to the client in their use of the online testing system or other Insight<br />

Assessment data collection or electronic testing systems. The protection of the Account Administrator login and<br />

password is the responsibility of the client. Changes to the client login and password code can be made by the client<br />

at any time for added security to the data stored in the client account. Protection of these codes, and thus the<br />

security of access to data in the client account, is the responsibility of the client. Insight Assessment / the <strong>California</strong><br />

Academic Press does due diligence in the protection of access to its database servers, and maintains the highest<br />

standards in data protection.<br />

Attention to data security also pertains to data collected or transmitted when: 1) establishing client<br />

accounts for the purchase of products; 2) processing bank transfers, checks, electronic payments, or purchase<br />

orders; 3) providing test taker access to products in electronic or paper-and-pencil format; 4) delivering testing to<br />

clients and test takers of clients; 5) the shipment of products; 6) transmission of scores; 7) return of scored<br />

assessment forms to clients; 8) assistance to test takers using the online testing system; 9) assisting clients to manage<br />

the online testing system.<br />

Instrument Protection<br />

Insight Assessment is a distributor of time-limited licenses to use high-quality tests and surveys and is committed to<br />

delivering those tests and surveys in an accurate and timely manner. Insight Assessment is also committed to<br />

providing clients with accurate reports of scores and other demographic data and survey results as entered by test<br />

takers in response to the questions on their assigned tests or surveys. The following notices are hereby published<br />

and made public as inclusive with the purchase of licenses to use Insight Assessment tests.<br />

Notice: Customers who purchase use licenses and testing services assure Insight Assessment that:<br />

Provisions will be made to guarantee instrument security. All copies of paper-and-pencil measurement tools<br />

will be kept in secure locations before and after testing.<br />

Online Login and Password information will be kept in secure locations before and after testing.<br />

Damaged or unusable paper-and-pencil assessment instruments will be shredded or disposed of in ways<br />

that do not compromise instrument security.<br />

Client acknowledges that Insight Assessment retains ownership of paper assessment booklets and online<br />

tests which are shipped to or used by the client for testing.<br />

Copyrights will not be violated. No portion of any assessment may be reproduced in any format without<br />

the specific permission of Insight Assessment.<br />

No copies of the assessment, in whole or in part, will be made. No assessment questions will be duplicated<br />

or distributed or published in any way that has the potential to compromise the security of the testing<br />

instrument.<br />

Neither the tests nor the assessment results are sold or made available for sale in the absence of a specified<br />

reseller contract.<br />

Assessments will be administered in accord with the procedures specified in this assessment manual.<br />

Assessment instruments and results will be used in accord with accepted ethical standards for the testing<br />

of human subjects.<br />

Assessment data, score interpretation, and use are the sole responsibility of the client and end-user.<br />

Notice: Insight Assessment reserves the right to withhold testing and scoring services or assessment score results<br />

for any client, test taker, or other user who does not abide by the terms and conditions of this purchase agreement,<br />

presents invalid CapScore response forms for scoring, misrepresents or falsifies responses to assessment<br />

questions, attempts to acquire or to distribute assessment questions or scales or answer keys, or who otherwise<br />

violates any of these terms or conditions. Insight Assessment reserves the right to discontinue services or decline<br />

purchase requests in the event that there is reason to be concerned that doing otherwise would compromise the<br />

security of any of its measurement tools.<br />

Notice: Insight Assessment is not responsible for verifying either the identity of the test taker or the truthfulness or<br />

accuracy of demographic or other information as entered by test takers. Insight Assessment is not responsible for<br />

the conditions under which tests are administered or taken. Detailed instructions for test administration are<br />

published in the assessment manuals which are part of each assessment's preview pack.<br />

Notice: Purchasers and users of Insight Assessment measurement tools incur by virtue of their purchase of said tools<br />

certain obligations and responsibilities, which are specified on our website. See www.insightassessment.com<br />

Notice: Clients are solely responsible for the transmittal of the original CapScore response forms for scoring. It is<br />

recommended that clients retain copies of these forms to protect against loss of CapScore response forms during<br />

shipment for scoring.<br />

Notice: Clients are solely responsible for storage of returned CapScore results, data, and datasets. If requested,<br />

Insight Assessment will search its CapScore data archives and retrieve datasets for clients. Contact Insight<br />

Assessment for cost information.<br />

Notice: Although all reasonable precautions are taken to ensure assessment security, Insight Assessment is not<br />

responsible for violations of copyrights or for the unauthorized distribution of any version of any assessment, or for<br />

any information about any assessment, whether this information be accurate or inaccurate. Insight Assessment will<br />

vigorously prosecute the unauthorized publication of information about any version of any of its tests to the full<br />

extent of the law.<br />

Notice: Clients and test takers are solely responsible for the interpretation and use of assessment scores, survey<br />

results, demographic information, or related descriptive information or statistical analyses as may be gathered,<br />

displayed, downloaded, or otherwise generated by Insight Assessment. User clients and user test takers agree to<br />

hold Insight Assessment, its officers and staff members, individually and jointly, harmless of any liability or<br />

responsibility, financial, personal, or otherwise, for the use or the misuse of any Insight Assessment gathered,<br />

displayed, analyzed, downloaded, or otherwise generated assessment scores, survey results, demographic<br />

information, or related descriptive information. See www.insightassessment.com<br />

Notice: Except for prepaid testing contracts and unexpired quotes, prices are subject to change without notice.<br />

Notice: Clients are hereby notified that the <strong>California</strong> Academic Press LLC, d.b.a. Insight Assessment owns the data<br />

derived from its online e-testing system and from CapScore and uses those data for research purposes such as the<br />

development of comparison percentiles for various groupings of institutional types on various testing instruments, e.g.<br />

"US undergraduate student percentiles on the BCTST." Insight Assessment is committed to maintaining individual<br />

and client confidentiality of E-<strong>Test</strong>ing and CapScore data and datasets.<br />

Sale of Use Licenses for <strong>Test</strong>ing to Qualified Buyers Only: For reasons of assessment security and to<br />

protect the testing programs of our institutional and industry clients, we restrict purchase of certain of our testing<br />

licenses to qualified professionals and researchers. Employers, human resource directors, assessment directors,<br />

university academic administrators, professors, teachers, personnel trainers, staff development professionals, and<br />

their agents may purchase measurement tools published by Insight Assessment. Sales staff may require evidence<br />

of credentials for access to assessment license purchase. See additional notifications relating to the sale and use of<br />

testing instrument licenses below.<br />

Notice: Insight Assessment reserves the right not to sell to qualified individuals who are enrolled in degree programs<br />

which use these tools for high-stakes testing purposes or who are employees of client organizations that use our<br />

tools for high-stakes testing purposes.<br />

Notice: Insight Assessment does not sell or supply testing materials for the purpose of psychometric analysis,<br />

validation of competing instruments, or as specimens in research methods or instrument development courses.<br />

Notice: Insight Assessment expects that clients will adhere to ethical standards for the use and interpretation of<br />

standardized tests.<br />

Notice: Insight Assessment reserves the right, at its sole discretion and without the obligation of prior notice or<br />

explanation, not to honor requests to purchase and requests to process scoring or report scores on its instruments<br />

regardless of the source of such requests.<br />

Notice: In all testing situations, compliance with the requirements of the Americans with Disabilities Act is solely the<br />

responsibility of the purchaser of tests and testing services.<br />

Notice: Doctoral Dissertation students may receive access to purchase test use licenses from Insight Assessment.<br />

Access to Insight Assessment testing instruments is made available to doctoral level students conducting dissertation<br />

research after application for this access is made to the company. Doctoral students are afforded a large research<br />

discount covered by internal grant funds. Supervision of these student researchers is the responsibility of sponsoring<br />

advisors. Any violation of the purchase contract is the responsibility of the supervising professor and the degree<br />

granting institution.<br />

Notice: Dissertation Advisors / Committee Review or IRB Committee Review -- IRB committee review of instruments<br />

can be provided through the online testing system only if protection of the instrument is assured by the dissertation<br />

director. Instrument validity and reliability preclude any editing or deleting of individual questions or assessment<br />

items.<br />

Notice: When filing a dissertation: It is NOT permitted to file a copy of any Insight Assessment instrument along with the dissertation unless the library will assure, in writing and under signature of the library director, that they will not distribute this copy as a part of the dissertation (neither by digital scan nor by hard copy) even if it is requested by other scholars. To include one of our testing instruments, or any of the items it contains, in a copy of a dissertation which is then distributed to a third party is both a violation of copyright and a violation of the contractual agreement for protection of the instrument that is assumed by the user at purchase.

Notice: No actual assessment questions or assessment items from the instrument may be included within any research report of studies that use one or more of the Insight Assessment testing instruments. The inclusion of any actual assessment items in a publication is both a violation of copyright and a violation of the contractual agreement for protection of the instrument that is assumed by the user at purchase. When discussing the instrument in your published paper, you may include verbatim excerpts from the assessment manual, properly cited. You may also use one or more of the example items posted on our website to help your reviewers and readers understand the nature of the instrument.

Disability Accommodation<br />

Decisions about how to accommodate test takers with documented disabilities are the sole responsibility of the purchaser. Insight Assessment is not aware of any limitation in the accommodations required for disabilities and should be notified should the purchaser discover any.

Suggested strategies for the accommodation of test takers with disabilities include, but are not limited to, allowing the test taker additional time to complete the assessment and reading assessment questions to the test taker. If a reader is used, the reader should not attempt to interpret, explain, or clarify assessment items.

To the extent that the documented disability involves cognitive processes needed for problem analysis and problem solving, there is no appropriate accommodation, as the tests themselves are designed to be a measure of relative performance in this regard.

The accommodation of individuals with disabilities does not relieve the customer of the terms and conditions in the sales contract relating to the security of testing instruments: no duplication, copying, digitization, or capture of assessment items as part of an accommodation, nor administration of the assessment through software programs that are not a part of the Insight Assessment testing systems, is permitted. No tests may be transferred to other file forms or modalities.


Copyright Notices<br />

Notice: All persons, including purchasers, reviewers, client users, and test taker users are hereby notified that Insight Assessment tests, testing materials, computer-based assessment and survey software, and the testing or survey content delivered in paper form or by means of this software are protected by copyright law and international treaties. All copyrights worldwide are reserved. Insight Assessment E-Testing software, and its code, images, and web content are protected by international copyright (c) 2002, 2006, 2008, 2009, 2010, 2012, 2015, 2016 held by the California Academic Press LLC, d.b.a. Insight Assessment. All rights worldwide are reserved.

Notice: Insight Assessment CapScore response forms, materials and services, and Insight Assessment's Online E-Testing software, and its code, images, and web content are protected by international copyrights and renewals (c) 1986, 1990, 1992, 1994, 1998, 2002, 2006, 2008, 2010, 2012, 2013, 2014, 2015, 2016 held by the California Academic Press d.b.a. Insight Assessment and the authors of the various tests and assessment instruments. Insight Assessment / the California Academic Press publishes tests, surveys and their associated questions, items, and response forms in English and in all translations, all of which are covered by various US and worldwide copyrights held by the assessment authors.

Notice: Instruments online or as paper booklets, assessment questions, assessment items, answer options, and answer forms may not be duplicated, copied, replicated, digitized, posted, displayed or reproduced in whole or in part, in paper or electronically, or by any means, without the prior written consent and agreement of the copyright holders, author(s), and publisher. Assessment items may not be posted on the public Internet or on any internal network. By purchasing assessment use licenses and their related assessment scoring and score reporting services, the buyer certifies that he/she has read, understands, and accepts the notices, advisories, and the purchaser requirements as indicated at the Insight Assessment website, the instrument manual, the price quote, and the sales order. The buyer and buyer's organization agree that by accessing Insight Assessment testing instrument(s) for any purpose, including but not limited to previewing the instrument(s), neither the buyer nor the buyer's organization shall create, design, develop, publish, market, or distribute any comparable or competitive testing instruments for a period of up to four years from the date of the most recent access to digital or paper versions of any Insight Assessment testing instrument(s).

Notice: All persons are hereby notified that any unauthorized reproduction or distribution of any part of this e-testing software, or any assessment or survey, or any response form delivered or made visible in paper, electronic, Internet, PDF, email, or any other media by any means whatsoever, or any question or answer, or any part thereof, is a violation of said copyright or copyrights and may result in severe civil and criminal penalties. Insight Assessment will defend its copyrights and the copyrights of its assessment authors vigorously, and will prosecute violators to the maximum extent under the law.

Notice: By accessing the Insight Assessment e-testing system, the user acknowledges that this computer-based assessment/survey software and the testing or survey content delivered by means of this software are protected by copyright law and international treaties. The user agrees that reproduction or distribution of any part of this e-testing software, or any assessment or survey delivered or made visible in any media by means of this software, or any question or answer, or any part thereof, is a violation of said copyright or copyrights, which may result in severe civil and criminal penalties and will be prosecuted to the maximum extent possible under the law.


California Critical Thinking Skills Test: The Gold Standard Critical Thinking Skills Tests for Undergraduate and Graduate Programs. User Manual and Resource Guide. Published by Insight Assessment, a Division of the California Academic Press. Phone: (650) 697-5628. www.insightassessment.com. © 2016 California Academic Press, San Jose, CA. All rights reserved worldwide.





Critical Thinking Test: Sample Questions with Explanations (2024)

Employers value and seek candidates who demonstrate advanced critical thinking skills, and they often administer critical thinking tests as part of their hiring process. These tests can be very difficult for those who don't prepare. A great way to start practicing is by taking our free critical thinking practice test.

What Does The Critical Thinking Test Include?

The Critical Thinking Test assesses your capacity to think critically and form logical conclusions from written information. Critical thinking tests are generally used in job recruitment, particularly in the legal sector, to measure a candidate's analytical critical thinking abilities.

Why Is Critical Thinking Useful?

Critical thinking is put into action in various stages of decision-making and problem-solving tasks:

  • Identify the problem
  • Choose suitable information to find the solution
  • Identify the assumptions that are implied and written in the text
  • Form hypotheses and choose the most suitable and credible answers
  • Form well-founded conclusions and determine the soundness of inferences

What Is the Watson Glaser Test and What Critical Thinking Skills Does It Measure?

The most common type of critical thinking test is the Watson-Glaser Critical Thinking Appraisal (W-GCTA). Typically used by legal and financial organizations, as well as management businesses, the Watson Glaser test is designed to assess candidates' critical thinking skills.

The test consists of 10 questions to be answered in approximately 10 minutes (although there is no timer on the test itself). Our test is slightly harder than the real thing, so that it provides sufficiently challenging practice.

You need to get 70% correct to pass the test. Don't forget to check out the test techniques section further down this page first.

Questions: 25

Pass percentage: 70%
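As a quick arithmetic sketch of the pass mark above (25 questions, 70% to pass), the minimum number of correct answers works out as follows; the round-up reflects that a half-correct answer is impossible:

```python
import math

# Figures stated above for this practice test.
questions = 25      # total questions
pass_percent = 70   # pass mark, in percent

# 70% of 25 is 17.5; round up, since you cannot get half a question right.
min_correct = math.ceil(questions * pass_percent / 100)
print(min_correct)  # 18
```

So you would need at least 18 correct answers out of 25 to reach the 70% pass mark.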

The test is broken down into five central areas:

  • Assumptions
  • Evaluation of Arguments
  • Deductions
  • Interpretation
  • Inferences


The Five Critical Thinking Skills Explained

1. Recognition of Assumptions

You'll be presented with a statement, followed by several proposed assumptions. When answering, you must work out whether each assumption was made in the statement or not. An assumption is a proclamation that an individual takes for granted. This section of the test measures your ability to refrain from forming assumptions about things that are not necessarily correct.

  • 1: Assumption Made
  • 2: Assumption Not Made

Although the passage does state that Charlie’s fundraising team is doing its best so that the charity event can meet its goal, nowhere did it state that their team is leading the event.

2. Evaluation of Arguments

You will be presented with an argument. You will then be asked to decide whether the argument is strong or weak. An argument is considered strong if it directly connects to the statement provided, and is believed to be significant.

No, participation awards should not be given in every competition because studies have shown that this would cause the participants to put in less effort because they will get a prize no matter what the outcome is.

  • 1: Strong Argument
  • 2: Weak Argument

This is a strong argument, as it provides evidence as to why participation awards should not be given in every competition.

3. Deductions

In deduction questions, you will need to form conclusions based solely on the information provided in the question, not on your own knowledge. You will be given a small passage of information, and you will need to evaluate a list of deductions made based on that passage. If a conclusion cannot be formed from the information provided, then the conclusion does not follow. The answer must be founded entirely on the statements made, not on conclusions drawn from your own knowledge.

At a surprise party for Donna, Edna arrived after Felix and Gary did. Kelly arrived before Felix and Gary did.

  • 1: Conclusion Follows
  • 2: Conclusion Does not Follow

For questions like this, jot down the clues to help you out. Use initials as a quick reference.

K | F&G | E

Looking at the simple diagram, "K" (Kelly) arrived before "E" (Edna) did, so the conclusion "Kelly arrived before Edna" follows. The answer is option 1: Conclusion Follows.
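The ordering logic above can also be checked mechanically. This short Python sketch (illustrative only, not part of any real test) enumerates every arrival order consistent with the two clues and confirms that Kelly precedes Edna in all of them, which is exactly what it means for the conclusion to follow:

```python
from itertools import permutations

people = ["Kelly", "Felix", "Gary", "Edna"]

def consistent(order):
    """True if this arrival order satisfies both clues from the passage."""
    pos = {name: i for i, name in enumerate(order)}
    # Clue 1: Edna arrived after Felix and Gary.
    # Clue 2: Kelly arrived before Felix and Gary.
    return (pos["Edna"] > pos["Felix"] and pos["Edna"] > pos["Gary"]
            and pos["Kelly"] < pos["Felix"] and pos["Kelly"] < pos["Gary"])

valid_orders = [o for o in permutations(people) if consistent(o)]

# The conclusion "Kelly arrived before Edna" follows only if it holds
# in every order consistent with the clues.
conclusion_follows = all(o.index("Kelly") < o.index("Edna") for o in valid_orders)
print(conclusion_follows)  # True
```

Note that only Felix and Gary's relative order is left open by the clues, so there are exactly two consistent orders, and Kelly comes first in both.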

4. Interpretation

In these questions, you are given a passage of information followed by a list of possible conclusions. You will need to interpret the information in the paragraph and determine whether or not each conclusion follows, based solely on the information given.

A number of students were given the following advice:

“The use of powerful words is a technique which makes you a better writer. Your choice of words is very important in molding the way people interact with your article. You should use powerful words to spice up your article. Power words should be used liberally to enhance the flavor of what you write!”

In the fourth sentence, it is stated, “Power words should be used liberally to enhance the flavor of what you write!”

Thus, if you were to write an essay, using powerful words can give more flavor to it.

5. Inferences

An inference is a conclusion drawn from observed or supposed facts and details. It is information that is not apparent in the text provided but rather is extracted from it. In this section, you will be provided with a passage of information about a specific scene or event. A list of possible inferences will then be given, and you will need to decide whether each is 'true', 'false', 'probably true', or 'probably false', or whether there is not enough information to say.

With the advancement of technology, the need for more infrastructure has never been higher. The current U.S. Administration plans to invest $1 trillion in improving infrastructure, a portion of which will go to priority projects and technologies that can strengthen the country's economic competitiveness, such as transportation, 5G wireless communication technology, rural broadband technologies, advanced manufacturing technologies, and even artificial intelligence.

It stated that it expects to work with Congress to develop a comprehensive infrastructure package, which is expected to have a budget of $200 billion for certain priorities.

  • 1: True
  • 2: Probably True
  • 3: Not Enough Information
  • 4: Probably False
  • 5: False

Although it was mentioned in the passage that the U.S. government is to allocate $200 billion on certain priorities, it did not specify if these certain priorities were for ‘transportation, 5G wireless communication technology, rural broadband technologies, advanced manufacturing technologies, and artificial intelligence’ or if the aforementioned priorities will have a different allocation.

What we can be sure of, however, is that at least a portion of the $1 trillion infrastructure budget will be used on the mentioned priorities regardless, meaning that there is a chance that $200 billion will be used on those aforementioned areas.

Improve Your Score with Prepterminal’s Critical Thinking Course

The Critical Thinking test is difficult, but not impossible to overcome with practice. At PrepTerminal our psychometric test experts have developed a critical thinking preparatory test to provide you with the material you need to practice for your critical thinking test. Prepare with us to increase your chance of successfully overcoming this hurdle in the recruitment process.

Prepterminal’s preparatory critical thinking course features a structured study course along with critical thinking practice tests to help you improve your exam score. Our course includes video and text-based information presented in a clear and easy-to-understand manner so you can follow along at your own pace with ease.


California Critical Thinking Skills Test: First Things First

Have you been shortlisted for a position and been asked to take the CCTST? Or perhaps you are a student and the CCTST gives you extra credit? Either way, you may have no idea how to prepare for it. No need to worry: here is a comprehensive guide explaining the CCTST and how to prepare for it.

What is the California Critical Thinking Skills Test?

The California Critical Thinking Skills Test is a discipline-neutral examination used to evaluate the reasoning skills of candidates. It has been used in all kinds of industries to gauge graduates' aptitude and logical thinking capabilities before shortlisting them for the next step in the recruitment process.

How does the test work?

The CCTST is designed to test whether a test-taker can demonstrate the critical and logical thinking skills required to solve problems, and in effect show real-world problem-solving abilities.

How difficult is the California Critical Thinking Skills Test?

The test consists of around 76 multiple-choice questions to be solved within 92 minutes or so; that means you will have a little over a minute per question. The CCTST has questions similar to those you would find in most reasoning tests.

What is the test passing score?

The California Critical Thinking Skills Test is measured on an array of scales: Analysis, Inference, Deduction, Induction, Evaluation, and overall reasoning skills. The test provides a grade for each of these skills. Most organizations have a cut-off score (many require a score of 65) for being selected.

What kind of abilities or knowledge do you need to pass this test?

As initially stated, this test requires critical thinking and problem-solving abilities. Its difficulty depends on the industry you are in. It is highly advisable to take practice exams several times to be prepared for the actual exam. Your very first attempt will show you where you are lacking in critical reasoning abilities.

If you are unable to solve a problem, look at the solution first and then retrace it backwards. With this technique, you will learn to look at problems from a fresh perspective.

And finally, practice makes perfect. Solve plenty of practice exams with a timer, and soon you will be surprised at how quickly you solve the problems.

How important is the CCTST for your evaluation?

In the United States, many organizations not only insist on passing this exam but also on scoring high in order to advance in the hiring process. Weakness in critical thinking can lead to failure to learn, confused communication, ineffective enforcement of rules, and so on. As a result, employers and educators make it obligatory for candidates to improve their reasoning abilities.

This test is a standardized measure of reasoning abilities and has been used for decades by many organizations, both in the United States and worldwide. Hence, if it is part of your assessment, be it for a job or an educational program, it is very important that you score well on the test.


Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical thinking process involves the active and skillful conceptualization, analysis, synthesis, and/or evaluation of information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. This is why it is so often emphasized in education and academia.

Some may even view it as a backbone of modern thought.

However, it's a skill, and skills must be trained and encouraged to be used to their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating their intellectual curiosity, etc.

Is critical thinking useful in writing?

Critical thinking can help in planning your paper and making it more concise, but this is not obvious at first. We have carefully pinpointed some of the questions you should ask yourself when applying critical thinking to your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Using critical thinking comes down not only to the outline of your paper; it also raises the question: how can we use critical thinking to solve problems in our writing's topic?

Let's say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You will first have to define critical thinking for the viewers, and then use plenty of critical thinking questions and related terms to familiarize them with your methods and the thinking process behind them.

Are there any services that can help me use more critical thinking?

We understand that it's difficult to learn how to use critical thinking more effectively in just one article, but our service is here to help.

We are a team specializing in writing essays and other assignments for college students and anyone else who needs a helping hand. We cover a great range of topics, offer high-quality work, always deliver on time, and aim to leave our customers completely satisfied with what they ordered.

The ordering process is fully online, and it goes as follows:

  • Select the topic and the deadline of your essay.
  • Provide us with any details, requirements, statements that should be emphasized or particular parts of the essay writing process you struggle with.
  • Leave the email address, where your completed order will be sent to.
  • Select your preferred payment type, sit back and relax!

With years of experience on the market, professionally degreed essay writers, online 24/7 customer support, and incredibly low prices, you won't find a service offering a better deal than ours.


Ethics Without Indoctrination

Implementation Philosophy

Bringing ethics into the curriculum is essential but difficult. Many teachers are deeply committed to didactic, lecture-based modes of teaching. If ethics is taught in this way, indoctrination results, and we have lost rather than gained ground. Better no ethics than dogmatic moralizing.

To successfully establish a solid framework of ethical reasoning throughout the curriculum, we need excellent supplemental resources and well-designed in-service. Whenever possible, teachers should have access to books and materials that demonstrate how ethical and critical thinking principles can be integrated into subject matter instruction. They also need opportunities to air whatever misgivings they have about the paradigm shift this model represents for many of them. Above all, one should conceive of a move such as this as part of a long-term strategy in which implementation is achieved progressively over an extended time.

Just as educators should respect the autonomy of students, so in-service design should respect the autonomy of teachers. Teachers can and should be helped to integrate a critical approach to ethics into their everyday teaching. But they must actively think their way to this integration. It should not be imposed on them. The model I suggest is one I have used successfully in in-service for both elementary and secondary teachers on numerous occasions. I call it the “Lesson Plan Remodeling Strategy” and have written three handbooks and an article explaining it in depth.

The basic idea is simple. Every practicing teacher works daily with lesson plans of one kind or another. To remodel lesson plans is to critique one or more lesson plans and formulate one or more new lesson plans based on that critical process. Thus, teachers or staff-development leaders equipped with a reasonable number of exemplary remodels and accompanying explanatory principles can design practice sessions that enable teachers to develop new teaching skills through experience in lesson remodeling.

Lesson plan remodeling can become a powerful tool in staff development for several reasons. It is action-oriented and puts an immediate emphasis on close examination and critical assessment of what is taught on a day-to-day basis. It makes the problem of infusion more manageable by paring it down to the critique of particular lesson plans and the progressive infusion of particular principles. It is developmental in that, over time, more and more lesson plans are remodeled, and what has been remodeled can be remodeled again.

In-Service Design

The idea behind in-service on this model is to take teachers step by step through specific stages of implementation. First, teachers must have an opportunity to become familiar with the basic concepts of critical thinking and ethical reasoning. They should formulate and discuss various general principles of morality, and then discuss how people with differing moral perspectives sometimes come to different moral conclusions when they apply these principles to actual events. Questions like "Is abortion morally justified?", "Under what conditions do people have a right to welfare support?", or "Is capital punishment ever morally justified?" can be used to demonstrate this point.

Working together, the teachers should then construct examples of how they might encourage their students to apply one or more of the moral reasoning skills listed in figure #1. One table might focus on devising ways to help students clarify moral issues and claims (S-8). Another might discuss assignments that would help students develop their moral perspective (S-7). A third might focus on ways to encourage one of the essential moral virtues, say, moral integrity. Of course, teachers should have examples for each of the moral reasoning skills, as well as model classroom activities that foster them; they should not be expected to work with nothing more than a list of abstract labels. The examples the teachers develop together should be written up and shared with all participants, with ample opportunity for constructive feedback.

Once teachers gain some confidence in devising activities that help students develop individual moral reasoning skills, they should try their hand at a full remodel. For this, each table takes an actual lesson plan and collectively develops a critique and a remodel that embodies moral reasoning skills explicitly set out as objectives of the lesson. As before, exemplary remodels should be available for teachers to compare with their own. The following components should be spelled out explicitly:

  • the original lesson plan (or an abstract of it)
  • a statement of the objectives of the plan
  • a critique of the original (Why does it need to be revised? What does it fail to do that it might do? Does it indoctrinate students?)
  • a listing of the moral reasoning skills to be infused
  • the remodeled lesson plan (containing references to where in the remodel the various moral reasoning skills are infused)

Eventually school-wide or district-wide handbooks of lesson remodels can be put together and disseminated. These can be updated yearly. At least one consultant with unquestionable credentials in critical thinking should be hired to provide outside feedback on the process and its products.

For a fuller explanation of this in-service process and a wide selection of examples, I refer the reader to either Critical Thinking Handbook: 4th–6th Grades or Critical Thinking Handbook: K–3, both of which are subtitled A Guide for Remodeling Lesson Plans in Language Arts, Social Studies & Science. Both integrate an emphasis on ethical reasoning into critical thinking infusion, though they do not explicitly express the component critical thinking skills with a moral reasoning emphasis (as I have in figure #1).

The handbook examples are easily adaptable as illustrations for the upper grade levels. In any case, handbooks or not, what we should aim at is teacher practice in critiquing and revising standard lesson plans, based on a knowledgeable commitment to critical thinking and moral reasoning. We should not expect teachers to begin with the knowledge base, or even the commitment, but only that with exposure, practice, and encouragement within a well-planned, long-term in-service implementation, proficiency and commitment will eventually emerge.

In my own experience conducting in-services, I have found it easy to begin this process with teachers. Though the teachers' early products are of mixed quality, all of what is produced is workable as a basis for developing further insights and teaching skills. The difficulty is not in getting the process started; it is in keeping it going. One new lesson plan does not by itself change an established style of teaching. Like all creatures of habit, teachers tend to revert on Monday to their established teaching practices. A real, ongoing effort is essential for lesson plan remodeling to become a way of life and not just an interesting in-service activity.

The Need for Leadership

I cannot overemphasize the need for leadership in this area. Teachers need to know that the administration is solidly behind them in this process, and that the time and effort they put in will not only be appreciated but also visibly built upon. The school-wide or district-wide handbooks mentioned above are one kind of visible by-product that teachers should see. An excellent start is to have key administrators actively participate in the in-service along with the teachers. But the support should not end there. Administrators should facilitate ongoing structures and activities to support the process: making and sharing videotapes, sending key personnel to conferences, establishing working committees, forming informal discussion groups, and creating opportunities for peer review.

These are only some of the many possibilities. Administrators should also be articulate defenders of an educational rather than a doctrinaire approach to morality. They should be ready, willing, and able to explain why and how critical thinking and ethics are integrated throughout the curriculum. They should make the approach intelligible to the school board and community, and engender enthusiasm for it. They should fight to preserve it if it is attacked by those good-hearted but closed-minded people who see morality personified in their particular moral perspectives and beliefs. Above all, they should make a critical and moral commitment to a moral and critical education for all students, and do so in a way that demonstrates to teachers and parents alike moral courage, perseverance, and integrity.

References

Ralph W. Clark, Introduction to Moral Reasoning. West Publishing Company, St. Paul: 1986.

Ronald N. Giere, Understanding Scientific Reasoning. Holt, Rinehart, and Winston, New York: 1979.

Kuzirian and Madaras, Taking Sides: Clashing Views on Controversial Issues in American History. Dushkin Publishing Group, Guilford, Conn.: 1985.

Richard Paul, "Critical Thinking: Fundamental to Education for a Free Society," Educational Leadership 42, September 1984.

Richard Paul, "Critical Thinking and the Critical Person," forthcoming in Thinking: Progress in Research and Teaching, Perkins et al., editors. Lawrence Erlbaum Associates, Inc.

Richard Paul, "Dialogical Thinking: Critical Thought Essential to the Acquisition of Rational Knowledge and Passions," in Teaching Thinking Skills: Theory and Practice, Joan Baron and Robert Sternberg, editors. W.H. Freeman & Company, 1987.

Richard Paul, "Critical Thinking Staff Development: Lesson Plan Remodeling as the Strategy," The Journal of Staff Development, Fall 1987, Paul Burden, editor.

Paul, Binker, Jensen, and Kreklau, Critical Thinking Handbook: 4th–6th Grades, A Guide for Remodeling Lesson Plans in Language Arts, Social Studies and Science. Center for Critical Thinking and Moral Critique (Sonoma State University, Rohnert Park, CA 94928), 1987.

Paul, Binker, and Charbonneau, Critical Thinking Handbook: K–3, A Guide for Remodeling Lesson Plans in Language Arts, Social Studies and Science. Center for Critical Thinking and Moral Critique, 1987.

Harvey Siegel, "Critical Thinking as an Educational Ideal," The Educational Forum, Nov. 1980.

William Graham Sumner, Folkways: A Study of the Sociological Importance of Usages, Manners, Customs, Mores, and Morals. Dover Publications, Inc., New York: 1906.

(This article is taken from Paul, R. (1993). Critical Thinking: What Every Student Needs to Survive in a Rapidly Changing World. Dillon Beach, CA: Foundation for Critical Thinking.)
