Stanford research shows how to improve students' critical thinking about scientific evidence

Physicists at Stanford and the University of British Columbia have found that encouraging students to repeatedly make decisions about data collected during introductory lab courses improves their critical thinking skills.

Image: a swinging pendulum. Students who gather their own data and make their own decisions in a simple pendulum experiment gain critical thinking skills that are useful in later physics courses, according to research at Stanford and the University of British Columbia.

Introductory lab courses are ubiquitous in science education, but there has been little evidence of how or whether they contribute to learning. They are often seen as primarily “cookbook” exercises in which students simply follow instructions to confirm results given in their textbooks, while learning little.

In a study published today in the Proceedings of the National Academy of Sciences, scientists from Stanford and the University of British Columbia show that guiding students to autonomous, iterative decision-making while carrying out common physics lab course experiments can significantly improve students’ critical thinking skills.

In the multi-year, ongoing study, the researchers followed first-year students in co-author Douglas Bonn’s introductory physics lab course at the University of British Columbia. They first established what students were, and were not, learning following the conventional instructional approach, and then systematically modified the instructions of some lab experiments to change how students think about data and their implications.

One of the first experiments the researchers tackled involved swinging a pendulum and using a stopwatch to time its period at two different amplitudes (release angles). Students conducting the traditional experiment would collect the data, compare them to the equation in the textbook, chalk up any discrepancies to mistakes and move along.

In the modified course, the students were instructed to make decisions based on that comparison: first, what should they do to improve the quality of their data, and second, how could they better test or explain the comparison between their data and the textbook result? These are basic steps in all scientific research.

Students chose improvements such as conducting more trials to reduce standard error, marking the floor to be more precise in measuring the angle, or simply putting the team member with the best trigger finger in charge of the stopwatch.
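
The first of those choices has a simple statistical payoff: the standard error of the mean shrinks as one over the square root of the number of trials. The Python sketch below illustrates the effect; the true period, the timing scatter, and the trial counts are illustrative assumptions, not values from the study.

```python
import math
import random
import statistics

TRUE_PERIOD = 2.0  # s; hypothetical pendulum period (not a value from the study)
TIMING_SD = 0.15   # s; assumed scatter from human stopwatch reaction time

def measure_period(n_trials, rng):
    """Simulate n stopwatch trials; return the mean and its standard error."""
    times = [rng.gauss(TRUE_PERIOD, TIMING_SD) for _ in range(n_trials)]
    sem = statistics.stdev(times) / math.sqrt(n_trials)
    return statistics.mean(times), sem

rng = random.Random(42)
for n in (5, 20, 80):
    mean, sem = measure_period(n, rng)
    print(f"{n:3d} trials: T = {mean:.3f} +/- {sem:.3f} s")
# Quadrupling the number of trials roughly halves the standard error.
```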

As their data improved, so did their understanding of the processes at work, as well as their confidence in their data and its ability to test predicted results.

“By actually taking good data, they can reveal that there’s this approximation in the equation that they learn in the textbook, and they learn new physics by this process,” said Natasha Holmes, the lead author on the study, who began the research as a doctoral candidate at UBC and is building upon it as a postdoctoral research fellow at Stanford.

“By iterating, making changes and learning about experimental design in a more deliberate way, they come out with a richer experience.”
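
The “approximation in the equation” Holmes refers to is presumably the small-angle approximation behind the textbook formula T = 2π√(L/g): at larger release angles the period is measurably longer. A sketch of the standard amplitude-correction series, with an arbitrary pendulum length:

```python
import math

G = 9.81      # m/s^2
LENGTH = 1.0  # m; arbitrary illustrative length

def small_angle_period(length):
    """Textbook result T0 = 2*pi*sqrt(L/g), independent of amplitude."""
    return 2 * math.pi * math.sqrt(length / G)

def corrected_period(length, theta0):
    """First terms of the series T = T0*(1 + theta0**2/16 + 11*theta0**4/3072 + ...)."""
    t0 = small_angle_period(length)
    return t0 * (1 + theta0**2 / 16 + 11 * theta0**4 / 3072)

t0 = small_angle_period(LENGTH)
for deg in (5, 10, 20, 30):
    t = corrected_period(LENGTH, math.radians(deg))
    print(f"{deg:2d} deg: T = {t:.4f} s ({100 * (t / t0 - 1):.2f}% above the small-angle value)")
```

At a 20- to 30-degree release angle the deviation is around one to two percent, which careful repeated timing can resolve.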

Researchers found that students taking an iterative decision-making approach to the experiment were 12 times more likely to think of and employ ways to improve their data than the students with the traditional instruction. Similarly, the experimental group was four times more likely to identify and explain the limits of their predictive model based on their data.

Even more encouraging, these students were still applying these same critical thinking skills a year later in another physics course.

“This is sort of a radical way to think about teaching, having students practice the thinking skills you want them to develop, but in another way it’s obvious common sense,” said co-author Carl Wieman, a professor of physics and of education at Stanford. “Natasha has shown here how powerful that approach can be.”

The ability to make decisions based on data is becoming increasingly important in public policy decisions, Wieman said, and understanding that any real data have a degree of uncertainty, and knowing how to arrive at meaningful conclusions in the face of that uncertainty, is essential. The iterative teaching method better prepares students for that reality.

“Students leave this class with fundamentally different ideas about interpretation of data and testing against model predictions, whether it’s about climate change or vaccine safety or swinging pendulums,” Wieman said.

At Stanford, Holmes is expanding her research, applying these lessons to a range of undergraduate courses across different levels and subjects.

If iterative design can get first-year students to employ expert-like behaviors, the gains could be greater in advanced courses, she said. When students embark on an independent project, for instance, they’ll be much better prepared to face and clear any hurdles.

“Students tell me that it helped them learn what it means to do science, and helped to see themselves as scientists and critical thinkers,” Holmes said. “I think it’s done a whole lot for their motivation and attitudes and beliefs about what they’re capable of. So at least from that perspective, I think experiment design that encourages iterative thinking will have huge benefits for students in the long run.”

Stanford LEAD Online Business Program

Critical Analytical Thinking

Session: 11 Sep 2024 – 10 Sep 2025

Critical Analytical Thinking is essentially the language of strategy. It adds structure and transparency to the analysis and formulation of strategy and helps executives make decisions in a collaborative, logical, and fact-driven fashion.

Course Introduction

This course will help you develop and hone skills necessary to analyze complex problems, formulate well-reasoned arguments, and consider alternative points of view. It will help you assess innovative business models, identify critical issues, develop and present well-reasoned positions, and evaluate evidence. You will apply those skills to address a variety of management problems in both this and subsequent courses in the LEAD Certificate program. Topics include:

  • Foundations of logical reasoning
  • Using and interpreting evidence
  • Designing experiments
  • Using analogies

We will use a combination of lectures and case studies to prepare you to present written and video arguments for your positions, and to critique and debate those of your peers.

Course Faculty

Haim Mendelson

Program dates, fees, and faculty subject to change. Consistent with its non-discrimination policy, Stanford’s programs are open to participants regardless of race, color, national or ethnic origin, sex, age, disability, religion, sexual orientation, gender identity or expression, veteran status, marital status or any other characteristic protected by applicable law.

Critical thinking (SearchWorks catalog record)

Contents:

  • Rules for Reasoning Revisited: Toward a Scientific Conception of Critical Thinking
  • Critical Thinking & Systems Thinking
  • Developing Critical Thinking through Probability Models
  • The Promotion of Critical Thinking Skills through Argument Mapping
  • Beyond GDP?: Towards a New System of Social Accounts
  • A Four-Component Instructional Model for Teacher Training in Critical-Thinking Instruction: Its Effectiveness & Influential Factors
  • Crucial Connections: An Exploration of Critical Thinking & Scholarly Writing
  • How Can Critical Thinking be Recognized in the Classroom?
  • Collaborative Argumentation in Structured & Unstructured Chat Environments
  • Critical Thinking: Epistemological Disturbances to Improve Critical Thinking
  • (source: Nielsen Book Data)

Supplement to Critical Thinking

How can one assess, for purposes of instruction or research, the degree to which a person possesses the dispositions, skills and knowledge of a critical thinker?

In psychometrics, assessment instruments are judged according to their validity and reliability.

Roughly speaking, an instrument is valid if it measures accurately what it purports to measure, given standard conditions. More precisely, the degree of validity is “the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests” (American Educational Research Association 2014: 11). In other words, a test is not valid or invalid in itself. Rather, validity is a property of an interpretation of a given score on a given test for a specified use. Determining the degree of validity of such an interpretation requires collection and integration of the relevant evidence, which may be based on test content, test takers’ response processes, a test’s internal structure, relationship of test scores to other variables, and consequences of the interpretation (American Educational Research Association 2014: 13–21). Criterion-related evidence consists of correlations between scores on the test and performance on another test of the same construct; its weight depends on how well supported is the assumption that the other test can be used as a criterion. Content-related evidence is evidence that the test covers the full range of abilities that it claims to test. Construct-related evidence is evidence that a correct answer reflects good performance of the kind being measured and an incorrect answer reflects poor performance.

An instrument is reliable if it consistently produces the same result, whether across different forms of the same test (parallel-forms reliability), across different items (internal consistency), across different administrations to the same person (test-retest reliability), or across ratings of the same answer by different people (inter-rater reliability). Internal consistency should be expected only if the instrument purports to measure a single undifferentiated construct, and thus should not be expected of a test that measures a suite of critical thinking dispositions or critical thinking abilities, assuming that some people are better in some of the respects measured than in others (for example, very willing to inquire but rather closed-minded). Otherwise, reliability is a necessary but not a sufficient condition of validity; a standard example of a reliable instrument that is not valid is a bathroom scale that consistently under-reports a person’s weight.
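
For concreteness, here is how two of the coefficients named above are typically computed: test-retest reliability as the correlation between two administrations, and internal consistency as Cronbach's alpha. This is a generic psychometric sketch with invented scores, not data from any instrument discussed here.

```python
import statistics

def pearson_r(x, y):
    """Correlation between paired scores, e.g., two administrations of a test."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (len(x) * statistics.pstdev(x) * statistics.pstdev(y))

def cronbach_alpha(items):
    """Internal consistency; items[i] holds every test taker's score on item i."""
    k = len(items)
    item_var = sum(statistics.pvariance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1 - item_var / statistics.pvariance(totals))

first = [23, 31, 28, 40, 35]   # made-up scores, first administration
second = [25, 30, 29, 41, 33]  # same five people, retest
print(f"test-retest r = {pearson_r(first, second):.2f}")

items = [[3, 4, 3, 5, 4], [2, 4, 3, 5, 5], [3, 5, 4, 5, 4]]  # three items, five people
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```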

Assessing dispositions is difficult if one uses a multiple-choice format with known adverse consequences of a low score. It is pretty easy to tell what answer to the question “How open-minded are you?” will get the highest score and to give that answer, even if one knows that the answer is incorrect. If an item probes less directly for a critical thinking disposition, for example by asking how often the test taker pays close attention to views with which the test taker disagrees, the answer may differ from reality because of self-deception or simple lack of awareness of one’s personal thinking style, and its interpretation is problematic, even if factor analysis enables one to identify a distinct factor measured by a group of questions that includes this one (Ennis 1996). Nevertheless, Facione, Sánchez, and Facione (1994) used this approach to develop the California Critical Thinking Dispositions Inventory (CCTDI). They began with 225 statements expressive of a disposition towards or away from critical thinking (using the long list of dispositions in Facione 1990a), validated the statements with talk-aloud and conversational strategies in focus groups to determine whether people in the target population understood the items in the way intended, administered a pilot version of the test with 150 items, and eliminated items that failed to discriminate among test takers or were inversely correlated with overall results or added little refinement to overall scores (Facione 2000). They used item analysis and factor analysis to group the measured dispositions into seven broad constructs: open-mindedness, analyticity, cognitive maturity, truth-seeking, systematicity, inquisitiveness, and self-confidence (Facione, Sánchez, and Facione 1994). The resulting test consists of 75 agree-disagree statements and takes 20 minutes to administer. A repeated disturbing finding is that North American students taking the test tend to score low on the truth-seeking sub-scale (on which a low score results from agreeing to such statements as the following: “To get people to agree with me I would give any reason that worked”. “Everyone always argues from their own self-interest, including me”. “If there are four reasons in favor and one against, I’ll go with the four”.) Development of the CCTDI made it possible to test whether good critical thinking abilities and good critical thinking dispositions go together, in which case it might be enough to teach one without the other. Facione (2000) reports that administration of the CCTDI and the California Critical Thinking Skills Test (CCTST) to almost 8,000 post-secondary students in the United States revealed a statistically significant but weak correlation between total scores on the two tests, and also between paired sub-scores from the two tests. The implication is that both abilities and dispositions need to be taught, that one cannot expect improvement in one to bring with it improvement in the other.
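
The screening step described above, eliminating statements that fail to discriminate or that correlate inversely with overall results, is commonly driven by corrected item-total correlations. A minimal sketch with invented ratings; the 0.2 cutoff is an arbitrary illustration, not the CCTDI's actual criterion.

```python
import statistics

def pearson_r(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (len(x) * statistics.pstdev(x) * statistics.pstdev(y))

def screen_items(responses, min_r=0.2):
    """Flag items whose corrected item-total correlation is weak or negative.

    responses[i][p] is person p's agree-disagree rating on item i;
    each item is compared against the total of all *other* items.
    """
    flagged = []
    for i, item in enumerate(responses):
        rest = [sum(person) - person[i] for person in zip(*responses)]
        r = pearson_r(item, rest)
        if r < min_r:
            flagged.append((i, round(r, 2)))
    return flagged

# Invented ratings: four items, six respondents; item 2 runs against the rest.
responses = [
    [4, 5, 3, 2, 4, 5],
    [4, 4, 3, 2, 5, 5],
    [2, 1, 4, 5, 2, 1],
    [5, 4, 3, 1, 4, 5],
]
print(screen_items(responses))  # -> [(2, -0.96)]: item 2 would be dropped
```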

A more direct way of assessing critical thinking dispositions would be to see what people do when put in a situation where the dispositions would reveal themselves. Ennis (1996) reports promising initial work with guided open-ended opportunities to give evidence of dispositions, but no standardized test seems to have emerged from this work. There are however standardized aspect-specific tests of critical thinking dispositions. The Critical Problem Solving Scale (Berman et al. 2001: 518) takes as a measure of the disposition to suspend judgment the number of distinct good aspects attributed to an option judged to be the worst among those generated by the test taker. Stanovich, West and Toplak (2011: 800–810) list tests developed by cognitive psychologists of the following dispositions: resistance to miserly information processing, resistance to myside thinking, absence of irrelevant context effects in decision-making, actively open-minded thinking, valuing reason and truth, tendency to seek information, objective reasoning style, tendency to seek consistency, sense of self-efficacy, prudent discounting of the future, self-control skills, and emotional regulation.

It is easier to measure critical thinking skills or abilities than to measure dispositions. The following currently available standardized tests purport to measure them: the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), the Cornell Critical Thinking Tests Level X and Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), the Ennis-Weir Critical Thinking Essay Test (Ennis & Weir 1985), the California Critical Thinking Skills Test (Facione 1990b, 1992), the Halpern Critical Thinking Assessment (Halpern 2016), the Critical Thinking Assessment Test (Center for Assessment & Improvement of Learning 2017), the Collegiate Learning Assessment (Council for Aid to Education 2017), the HEIghten Critical Thinking Assessment (https://territorium.com/heighten/), and a suite of critical thinking assessments for different groups and purposes offered by Insight Assessment (https://www.insightassessment.com/products). The Critical Thinking Assessment Test (CAT) is unique among them in being designed for use by college faculty to help them improve their development of students’ critical thinking skills (Haynes et al. 2015; Haynes & Stein 2021). Also, for some years the United Kingdom body OCR (Oxford Cambridge and RSA Examinations) awarded AS and A Level certificates in critical thinking on the basis of an examination (OCR 2011). Many of these standardized tests have received scholarly evaluations at the hands of, among others, Ennis (1958), McPeck (1981), Norris and Ennis (1989), Fisher and Scriven (1997), Possin (2008, 2013a, 2013b, 2013c, 2014, 2020) and Hatcher and Possin (2021). Their evaluations provide a useful set of criteria that such tests ideally should meet, as does the description by Ennis (1984) of problems in testing for competence in critical thinking: the soundness of multiple-choice items, the clarity and soundness of instructions to test takers, the information and mental processing used in selecting an answer to a multiple-choice item, the role of background beliefs and ideological commitments in selecting an answer to a multiple-choice item, the tenability of a test’s underlying conception of critical thinking and its component abilities, the set of abilities that the test manual claims are covered by the test, the extent to which the test actually covers these abilities, the appropriateness of the weighting given to various abilities in the scoring system, the accuracy and intellectual honesty of the test manual, the interest of the test to the target population of test takers, the scope for guessing, the scope for choosing a keyed answer by being test-wise, precautions against cheating in the administration of the test, clarity and soundness of materials for training essay graders, inter-rater reliability in grading essays, and clarity and soundness of advance guidance to test takers on what is required in an essay. Rear (2019) has challenged the use of standardized tests of critical thinking as a way to measure educational outcomes, on the grounds that they (1) fail to take into account disputes about conceptions of critical thinking, (2) are not completely valid or reliable, and (3) fail to evaluate skills used in real academic tasks. He proposes instead assessments based on discipline-specific content.

There are also aspect-specific standardized tests of critical thinking abilities. Stanovich, West and Toplak (2011: 800–810) list tests of probabilistic reasoning, insights into qualitative decision theory, knowledge of scientific reasoning, knowledge of rules of logical consistency and validity, and economic thinking. They also list instruments that probe for irrational thinking, such as superstitious thinking, belief in the superiority of intuition, over-reliance on folk wisdom and folk psychology, belief in “special” expertise, financial misconceptions, overestimation of one’s introspective powers, dysfunctional beliefs, and a notion of self that encourages egocentric processing. They regard these tests along with the previously mentioned tests of critical thinking dispositions as the building blocks for a comprehensive test of rationality, whose development (they write) may be logistically difficult and would require millions of dollars.

A superb example of assessment of an aspect of critical thinking ability is the Test on Appraising Observations (Norris & King 1983, 1985, 1990a, 1990b), which was designed for classroom administration to senior high school students. The test focuses entirely on the ability to appraise observation statements and in particular on the ability to determine in a specified context which of two statements there is more reason to believe. According to the test manual (Norris & King 1985, 1990b), a person’s score on the multiple-choice version of the test, which is the number of items that are answered correctly, can justifiably be given either a criterion-referenced or a norm-referenced interpretation.

On a criterion-referenced interpretation, those who do well on the test have a firm grasp of the principles for appraising observation statements, and those who do poorly have a weak grasp of them. This interpretation can be justified by the content of the test and the way it was developed, which incorporated a method of controlling for background beliefs articulated and defended by Norris (1985). Norris and King synthesized from judicial practice, psychological research and common-sense psychology 31 principles for appraising observation statements, in the form of empirical generalizations about tendencies, such as the principle that observation statements tend to be more believable than inferences based on them (Norris & King 1984). They constructed items in which exactly one of the 31 principles determined which of two statements was more believable. Using a carefully constructed protocol, they interviewed about 100 students who responded to these items in order to determine the thinking that led them to choose the answers they did (Norris & King 1984). In several iterations of the test, they adjusted items so that selection of the correct answer generally reflected good thinking and selection of an incorrect answer reflected poor thinking. Thus they have good evidence that good performance on the test is due to good thinking about observation statements and that poor performance is due to poor thinking about observation statements. Collectively, the 50 items on the final version of the test require application of 29 of the 31 principles for appraising observation statements, with 13 principles tested by one item, 12 by two items, three by three items, and one by four items. Thus there is comprehensive coverage of the principles for appraising observation statements. Fisher and Scriven (1997: 135–136) judge the items to be well worked and sound, with one exception. The test is clearly written at a grade 6 reading level, meaning that poor performance cannot be attributed to difficulties in reading comprehension by the intended adolescent test takers. The stories that frame the items are realistic, and are engaging enough to stimulate test takers’ interest. Thus the most plausible explanation of a given score on the test is that it reflects roughly the degree to which the test taker can apply principles for appraising observations in real situations. In other words, there is good justification of the proposed interpretation that those who do well on the test have a firm grasp of the principles for appraising observation statements and those who do poorly have a weak grasp of them.

To get norms for performance on the test, Norris and King arranged for seven groups of high school students in different types of communities and with different levels of academic ability to take the test. The test manual includes percentiles, means, and standard deviations for each of these seven groups. These norms allow teachers to compare the performance of their class on the test to that of a similar group of students.
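
A norm-referenced interpretation of this kind amounts to locating a raw score within a norm group's distribution. A minimal sketch, assuming one common definition of percentile rank and invented norm-group scores:

```python
def percentile_rank(score, norm_scores):
    """Percent of the norm group scoring at or below the given raw score."""
    return 100 * sum(1 for s in norm_scores if s <= score) / len(norm_scores)

# Invented norm-group raw scores (out of 50 items), standing in for one
# of the seven groups reported in the manual.
norm_group = [28, 31, 33, 34, 35, 36, 36, 38, 40, 42, 44, 45]
print(f"raw score 38 -> {percentile_rank(38, norm_group):.0f}th percentile")
```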

Copyright © 2022 by David Hitchcock <hitchckd@mcmaster.ca>
