Article • 8 min read

Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills that you need to develop your critical thinking, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds, for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies, for help with this.

Use creative problem solving to balance cold logic. By thinking outside the box, you can identify new possible outcomes from information that you already have.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases and it can be difficult to identify them in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference. It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date?
  • Has the information received any direct criticism?
  • Does the information have any errors or inaccuracies?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion, rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.
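For readers who like a concrete illustration, the pairwise tallying behind Paired Comparison Analysis can be sketched in a few lines of Python. This is a minimal sketch, not Mind Tools' worksheet: the options, the weights, and the preference rule below are illustrative assumptions standing in for the judgment call you would make for each pair.

```python
from itertools import combinations

def paired_comparison_ranking(options, prefer):
    """Rank options by comparing every pair and tallying the wins.

    `prefer(a, b)` returns whichever of the pair the decision maker
    judges more important; the caller supplies that judgment.
    """
    scores = {option: 0 for option in options}
    for a, b in combinations(options, 2):
        scores[prefer(a, b)] += 1  # the preferred option earns one point
    # Sort by number of pairwise "wins", highest first.
    return sorted(options, key=lambda o: scores[o], reverse=True)

# Hypothetical importance weights standing in for real pairwise judgments.
weights = {"cost": 3, "speed": 1, "quality": 2}
ranking = paired_comparison_ranking(
    list(weights), prefer=lambda a, b: a if weights[a] >= weights[b] else b
)
print(ranking)  # ['cost', 'quality', 'speed']
```

The point of the technique is that each comparison involves only two items at a time, which is an easier judgment than ranking the whole list at once; the overall order falls out of the tallies.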

The final step involves challenging the information and evaluating its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking, for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. It's a useful skill in the workplace and in life.

You'll need curiosity and creativity to explore alternative possibilities, rationality to apply logic, and self-awareness to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.


Critical Thinking and Evaluating Information


What Is Critical Thinking?


"If there was one life skill everyone on the planet needed, it was the ability to think with critical objectivity." (Henry David Thoreau)

Critical thinking is a complex process of deliberation that involves a wide range of skills and attitudes. It includes:

  • identifying other people's positions,  arguments and conclusions 
  • evaluating the evidence  for alternative points of view
  • weighing up the opposing arguments  and evidence fairly
  • being able to read between the lines,  seeing behind surfaces and identifying false or unfair assumptions
  • recognizing techniques  used to make certain positions more appealing than others, such as false logic and persuasive devices
  • reflecting on issues  in a structured way, bringing logic and insight to bear
  • drawing conclusions  about whether arguments are valid and justifiable, based on good evidence and sensible assumptions
  • presenting a point of view  in a structured, clear, well-reasoned way that convinces others

(Cottrell, 2011)


A well-cultivated critical thinker:

  • raises vital questions and problems, formulating them clearly and precisely;
  • gathers and assesses relevant information, using abstract ideas to interpret it effectively;
  • comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards;
  • thinks openmindedly within alternative systems of thought, recognizing and assessing, as need be, their assumptions, implications, and practical consequences; and
  • communicates effectively with others in figuring out solutions to complex problems.

Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking. It presupposes assent to rigorous standards of excellence and mindful command of their use. It entails effective communication and problem solving abilities and a commitment to overcome our native egocentrism and sociocentrism.  

(Taken from Richard Paul and Linda Elder,  The Miniature Guide to Critical Thinking Concepts and Tools,  Foundation for Critical Thinking Press, 2008)

Source: criticalthinking.org

Is the sky really blue? That might seem obvious. But sometimes things are more nuanced and complicated than you think. Here are five strategies to boost your critical thinking skills. Animated by Ana Stefaniak. Made in partnership with The Open University.

Video Source: BBC Ideas


  • Last Updated: Mar 25, 2024 8:18 AM
  • URL: https://libguides.coastalpines.edu/thinkcritically


Original Research Article

Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation


  • 1 Lynch School of Education and Human Development, Boston College, Chestnut Hill, MA, United States
  • 2 Graduate School of Education, Stanford University, Stanford, CA, United States
  • 3 Department of Business and Economics Education, Johannes Gutenberg University, Mainz, Germany

Enhancing students’ critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves dealing with dilemmas involving ambiguity or conflicts among principles and contradictory information. We argue that performance assessment provides the most realistic—and most credible—approach to measuring CT. From this conceptualization and construct definition, we describe one possible framework for building performance assessments of CT with attention to extended performance tasks within the assessment system. The framework is a product of an ongoing, collaborative effort, the International Performance Assessment of Learning (iPAL). The framework comprises four main aspects: (1) The storyline describes a carefully curated version of a complex, real-world situation. (2) The challenge frames the task to be accomplished. (3) A portfolio of documents in a range of formats is drawn from multiple sources chosen to have specific characteristics. (4) The scoring rubric comprises a set of scales each linked to a facet of the construct. We discuss a number of use cases, as well as the challenges that arise with the use and valid interpretation of performance assessments. The final section presents elements of the iPAL research program that involve various refinements and extensions of the assessment framework, a number of empirical studies, along with linkages to current work in online reading and information processing.

Introduction

In their mission statements, most colleges declare that a principal goal is to develop students’ higher-order cognitive skills such as critical thinking (CT) and reasoning (e.g., Shavelson, 2010 ; Hyytinen et al., 2019 ). The importance of CT is echoed by business leaders ( Association of American Colleges and Universities [AACU], 2018 ), as well as by college faculty (for curricular analyses in Germany, see e.g., Zlatkin-Troitschanskaia et al., 2018 ). Indeed, in the 2019 administration of the Faculty Survey of Student Engagement (FSSE), 93% of faculty reported that they “very much” or “quite a bit” structure their courses to support student development with respect to thinking critically and analytically. In a listing of 21st century skills, CT was the most highly ranked among FSSE respondents ( Indiana University, 2019 ). Nevertheless, there is considerable evidence that many college students do not develop these skills to a satisfactory standard ( Arum and Roksa, 2011 ; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019 ). This state of affairs represents a serious challenge to higher education – and to society at large.

In view of the importance of CT, as well as evidence of substantial variation in its development during college, its proper measurement is essential to tracking progress in skill development and to providing useful feedback to both teachers and learners. Feedback can help focus students’ attention on key skill areas in need of improvement, and provide insight to teachers on choices of pedagogical strategies and time allocation. Moreover, comparative studies at the program and institutional level can inform higher education leaders and policy makers.

The conceptualization and definition of CT presented here is closely related to models of information processing and online reasoning, the skills that are the focus of this special issue. These two skills are especially germane to the learning environments that college students experience today when much of their academic work is done online. Ideally, students should be capable of more than naïve Internet search, followed by copy-and-paste (e.g., McGrew et al., 2017 ); rather, for example, they should be able to critically evaluate both sources of evidence and the quality of the evidence itself in light of a given purpose ( Leu et al., 2020 ).

In this paper, we present a systematic approach to conceptualizing CT. From that conceptualization and construct definition, we present one possible framework for building performance assessments of CT with particular attention to extended performance tasks within the test environment. The penultimate section discusses some of the challenges that arise with the use and valid interpretation of performance assessment scores. We conclude the paper with a section on future perspectives in an emerging field of research – the iPAL program.

Conceptual Foundations, Definition and Measurement of Critical Thinking

In this section, we briefly review the concept of CT and its definition. In accordance with the principles of evidence-centered design (ECD; Mislevy et al., 2003 ), the conceptualization drives the measurement of the construct; that is, implementation of ECD directly links aspects of the assessment framework to specific facets of the construct. We then argue that performance assessments designed in accordance with such an assessment framework provide the most realistic—and most credible—approach to measuring CT. The section concludes with a sketch of an approach to CT measurement grounded in performance assessment .

Concept and Definition of Critical Thinking

Taxonomies of 21st century skills ( Pellegrino and Hilton, 2012 ) abound, and it is neither surprising that CT appears in most taxonomies of learning, nor that there are many different approaches to defining and operationalizing the construct of CT. There is, however, general agreement that CT is a multifaceted construct ( Liu et al., 2014 ). Liu et al. (2014) identified five key facets of CT: (i) evaluating evidence and the use of evidence; (ii) analyzing arguments; (iii) understanding implications and consequences; (iv) developing sound arguments; and (v) understanding causation and explanation.

There is empirical support for these facets from college faculty. A 2016–2017 survey conducted by the Higher Education Research Institute (HERI) at the University of California, Los Angeles found that a substantial majority of faculty respondents “frequently” encouraged students to: (i) evaluate the quality or reliability of the information they receive; (ii) recognize biases that affect their thinking; (iii) analyze multiple sources of information before coming to a conclusion; and (iv) support their opinions with a logical argument ( Stolzenberg et al., 2019 ).

There is general agreement that CT involves the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion (e.g., Erwin and Sebrell, 2003 ; Kosslyn and Nelson, 2017 ; Shavelson et al., 2018 ). We further suggest that CT includes dealing with dilemmas of ambiguity or conflict among principles and contradictory information ( Oser and Biedermann, 2020 ).

Importantly, Oser and Biedermann (2020) posit that CT can be manifested at three levels. The first level, Critical Analysis , is the most complex of the three levels. Critical Analysis requires both knowledge in a specific discipline (conceptual) and procedural analytical (deduction, inclusion, etc.) knowledge. The second level is Critical Reflection , which involves more generic skills “… necessary for every responsible member of a society” (p. 90). It is “a basic attitude that must be taken into consideration if (new) information is questioned to be true or false, reliable or not reliable, moral or immoral etc.” (p. 90). To engage in Critical Reflection, one needs not only apply analytic reasoning, but also adopt a reflective stance toward the political, social, and other consequences of choosing a course of action. It also involves analyzing the potential motives of various actors involved in the dilemma of interest. The third level, Critical Alertness , involves questioning one’s own or others’ thinking from a skeptical point of view.

Wheeler and Haertel (1993) categorized higher-order skills, such as CT, into two types: (i) when solving problems and making decisions in professional and everyday life, for instance, related to civic affairs and the environment; and (ii) in situations where various mental processes (e.g., comparing, evaluating, and justifying) are developed through formal instruction, usually in a discipline. Hence, in both settings, individuals must confront situations that typically involve a problematic event, contradictory information, and possibly conflicting principles. Indeed, there is an ongoing debate concerning whether CT should be evaluated using generic or discipline-based assessments ( Nagel et al., 2020 ). Whether CT skills are conceptualized as generic or discipline-specific has implications for how they are assessed and how they are incorporated into the classroom.

In the iPAL project, CT is characterized as a multifaceted construct that comprises conceptualizing, analyzing, drawing inferences or synthesizing information, evaluating claims, and applying the results of these reasoning processes to various purposes (e.g., solve a problem, decide on a course of action, find an answer to a given question or reach a conclusion) ( Shavelson et al., 2019 ). In the course of carrying out a CT task, an individual typically engages in activities such as specifying or clarifying a problem; deciding what information is relevant to the problem; evaluating the trustworthiness of information; avoiding judgmental errors based on “fast thinking”; avoiding biases and stereotypes; recognizing different perspectives and how they can reframe a situation; considering the consequences of alternative courses of actions; and communicating clearly and concisely decisions and actions. The order in which activities are carried out can vary among individuals and the processes can be non-linear and reciprocal.

In this article, we focus on generic CT skills. The importance of these skills derives not only from their utility in academic and professional settings, but also the many situations involving challenging moral and ethical issues – often framed in terms of conflicting principles and/or interests – to which individuals have to apply these skills ( Kegan, 1994 ; Tessier-Lavigne, 2020 ). Conflicts and dilemmas are ubiquitous in the contexts in which adults find themselves: work, family, civil society. Moreover, to remain viable in the global economic environment – one characterized by increased competition and advances in second generation artificial intelligence (AI) – today’s college students will need to continually develop and leverage their CT skills. Ideally, colleges offer a supportive environment in which students can develop and practice effective approaches to reasoning about and acting in learning, professional and everyday situations.

Measurement of Critical Thinking

Critical thinking is a multifaceted construct that poses many challenges to those who would develop relevant and valid assessments. For those interested in current approaches to the measurement of CT that are not the focus of this paper, consult Zlatkin-Troitschanskaia et al. (2018) .

In this paper, we have singled out performance assessment as it offers important advantages to measuring CT. Extant tests of CT typically employ response formats such as forced-choice or short-answer, and scenario-based tasks (for an overview, see Liu et al., 2014 ). They all suffer from moderate to severe construct underrepresentation; that is, they fail to capture important facets of the CT construct such as perspective taking and communication. High fidelity performance tasks are viewed as more authentic in that they provide a problem context and require responses that are more similar to what individuals confront in the real world than what is offered by traditional multiple-choice items ( Messick, 1994 ; Braun, 2019 ). This greater verisimilitude promises higher levels of construct representation and lower levels of construct-irrelevant variance. Such performance tasks have the capacity to measure facets of CT that are imperfectly assessed, if at all, using traditional assessments ( Lane and Stone, 2006 ; Braun, 2019 ; Shavelson et al., 2019 ). However, these assertions must be empirically validated, and the measures should be subjected to psychometric analyses. Evidence of the reliability, validity, and interpretative challenges of performance assessment (PA) are extensively detailed in Davey et al. (2015) .

We adopt the following definition of performance assessment:

A performance assessment (sometimes called a work sample when assessing job performance) … is an activity or set of activities that requires test takers, either individually or in groups, to generate products or performances in response to a complex, most often real-world task. These products and performances provide observable evidence bearing on test takers’ knowledge, skills, and abilities—their competencies—in completing the assessment ( Davey et al., 2015 , p. 10).

A performance assessment typically includes an extended performance task and short constructed-response and selected-response (i.e., multiple-choice) tasks (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). In this paper, we refer to both individual performance- and constructed-response tasks as performance tasks (PT) (For an example, see Table 1 in section “iPAL Assessment Framework”).


Table 1. The iPAL assessment framework.

An Approach to Performance Assessment of Critical Thinking: The iPAL Program

The approach to CT presented here is the result of ongoing work undertaken by the International Performance Assessment of Learning collaborative (iPAL 1 ). iPAL is an international consortium of volunteers, primarily from academia, who have come together to address the dearth in higher education of research and practice in measuring CT with performance tasks ( Shavelson et al., 2018 ). In this section, we present iPAL’s assessment framework as the basis of measuring CT, with examples along the way.

iPAL Background

The iPAL assessment framework builds on the Council of Aid to Education’s Collegiate Learning Assessment (CLA). The CLA was designed to measure cross-disciplinary, generic competencies, such as CT, analytic reasoning, problem solving, and written communication ( Klein et al., 2007 ; Shavelson, 2010 ). Ideally, each PA contained an extended PT (e.g., examining a range of evidential materials related to the crash of an aircraft) and two short PT’s: one in which students either critique an argument or provide a solution in response to a real-world societal issue.

Motivated by considerations of adequate reliability, the CLA was modified in 2012 to create the CLA+. The CLA+ includes two subtests: a PT and a 25-item Selected Response Question (SRQ) section. The PT presents a document or problem statement and an assignment based on that document which elicits an open-ended response. The CLA+ added the SRQ section (which is not linked substantively to the PT scenario) to increase the number of student responses to obtain more reliable estimates of performance at the student-level than could be achieved with a single PT ( Zahner, 2013 ; Davey et al., 2015 ).

iPAL Assessment Framework

Methodological Foundations

The iPAL framework evolved from the Collegiate Learning Assessment developed by Klein et al. (2007) . It was also informed by the results from the AHELO pilot study ( Organisation for Economic Co-operation and Development [OECD], 2012 , 2013 ), as well as the KoKoHs research program in Germany (for an overview see, Zlatkin-Troitschanskaia et al., 2017 , 2020 ). The ongoing refinement of the iPAL framework has been guided in part by the principles of Evidence Centered Design (ECD) ( Mislevy et al., 2003 ; Mislevy and Haertel, 2006 ; Haertel and Fujii, 2017 ).

In educational measurement, an assessment framework plays a critical intermediary role between the theoretical formulation of the construct and the development of the assessment instrument containing tasks (or items) intended to elicit evidence with respect to that construct ( Mislevy et al., 2003 ). Builders of the assessment framework draw on the construct theory and operationalize it in a way that provides explicit guidance to PT’s developers. Thus, the framework should reflect the relevant facets of the construct, where relevance is determined by substantive theory or an appropriate alternative such as behavioral samples from real-world situations of interest (criterion-sampling; McClelland, 1973 ), as well as the intended use(s) (for an example, see Shavelson et al., 2019 ). By following the requirements and guidelines embodied in the framework, instrument developers strengthen the claim of construct validity for the instrument ( Messick, 1994 ).

An assessment framework can be specified at different levels of granularity: an assessment battery (“omnibus” assessment, for an example see below), a single performance task, or a specific component of an assessment ( Shavelson, 2010 ; Davey et al., 2015 ). In the iPAL program, a performance assessment comprises one or more extended performance tasks and additional selected-response and short constructed-response items. The focus of the framework specified below is on a single PT intended to elicit evidence with respect to some facets of CT, such as the evaluation of the trustworthiness of the documents provided and the capacity to address conflicts of principles.

From the ECD perspective, an assessment is an instrument for generating information to support an evidentiary argument and, therefore, the intended inferences (claims) must guide each stage of the design process. The construct of interest is operationalized through the Student Model , which represents the target knowledge, skills, and abilities, as well as the relationships among them. The student model should also make explicit the assumptions regarding student competencies in foundational skills or content knowledge. The Task Model specifies the features of the problems or items posed to the respondent, with the goal of eliciting the evidence desired. The assessment framework also describes the collection of task models comprising the instrument, with considerations of construct validity, various psychometric characteristics (e.g., reliability) and practical constraints (e.g., testing time and cost). The student model provides grounds for evidence of validity, especially cognitive validity; namely, that the students are thinking critically in responding to the task(s).

In the present context, the target construct (CT) is the competence of individuals to think critically, which entails solving complex, real-world problems and clearly communicating their conclusions or recommendations for action based on trustworthy, relevant, and unbiased information. The situations, drawn from actual events, are challenging and may arise in many possible settings. In contrast to more reductionist approaches to assessment development, the iPAL approach and framework rest on the assumption that properly addressing these situational demands requires the application of a constellation of CT skills appropriate to the particular task presented (e.g., Shavelson, 2010, 2013). For a PT, the assessment framework must also specify the rubric by which the responses will be evaluated. The rubric must be properly linked to the target construct so that the resulting score profile constitutes evidence that is both relevant and interpretable in terms of the student model (for an example, see Zlatkin-Troitschanskaia et al., 2019).

iPAL Task Framework

The iPAL ‘omnibus’ framework comprises four main aspects: A storyline , a challenge , a document library , and a scoring rubric . Table 1 displays these aspects, brief descriptions of each, and the corresponding examples drawn from an iPAL performance assessment (version adapted from the original in Hyytinen and Toom, 2019). Storylines are drawn from various domains; for example, the worlds of business, public policy, civics, medicine, and family. They often involve moral and/or ethical considerations. Deriving an appropriate storyline from a real-world situation requires careful consideration of which features are to be kept in toto , which adapted for purposes of the assessment, and which discarded. Framing the challenge demands care in wording so that there is minimal ambiguity in what is required of the respondent. The difficulty of the challenge depends, in large part, on the nature and extent of the information provided in the document library , the amount of scaffolding included, and the scope of the required response. The amount of information and the scope of the challenge should be commensurate with the amount of time available. As is evident from the table, the characteristics of the documents in the library are intended to elicit responses related to facets of CT. For example, with regard to bias, the information provided is intended to play on judgmental errors due to fast thinking and/or motivated reasoning. Ideally, the situation should accommodate multiple solutions of varying degrees of merit.

The dimensions of the scoring rubric are derived from the Task Model and Student Model ( Mislevy et al., 2003 ) and signal which features are to be extracted from the response and indicate how they are to be evaluated. There should be a direct link between the evaluation of the evidence and the claims that are made with respect to the key features of the task model and student model . More specifically, the task model specifies the various manipulations embodied in the PA and so informs scoring, while the student model specifies the capacities students employ in more or less effectively responding to the tasks. The score scales for each of the five facets of CT (see section “Concept and Definition of Critical Thinking”) can be specified using appropriate behavioral anchors (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). Of particular importance is the evaluation of the response with respect to the last dimension of the scoring rubric; namely, the overall coherence and persuasiveness of the argument, building on the explicit or implicit characteristics related to the first five dimensions. The scoring process must be monitored carefully to ensure that (trained) raters are judging each response based on the same types of features and evaluation criteria ( Braun, 2019 ) as indicated by interrater agreement coefficients.
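Interrater agreement of the kind referred to above can be quantified in several ways. As a purely illustrative sketch (not necessarily the coefficient used in the iPAL studies), Cohen's unweighted kappa for two raters scoring the same set of responses can be computed as follows; the ratings are invented for demonstration.

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters (unweighted kappa)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement by chance, from each rater's marginal distribution
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[cat] * c2[cat] for cat in c1.keys() | c2.keys()) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical scores from two trained raters on a six-point scale
r1 = [4, 5, 2, 6, 3, 4, 1, 5, 3, 2]
r2 = [4, 5, 3, 6, 3, 4, 2, 5, 4, 2]
print(round(cohen_kappa(r1, r2), 2))  # → 0.63
```

In practice, a weighted kappa or a generalizability coefficient would typically be preferred for ordered rating scales, since near-misses on a six-point scale are less serious than large discrepancies.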

The scoring rubric of the iPAL omnibus framework can be modified for specific tasks ( Lane and Stone, 2006 ). This generic rubric helps ensure consistency across rubrics for different storylines. For example, Zlatkin-Troitschanskaia et al. (2019 , p. 473) used the following scoring scheme:

Based on our construct definition of CT and its four dimensions: (D1-Info) recognizing and evaluating information, (D2-Decision) recognizing and evaluating arguments and making decisions, (D3-Conseq) recognizing and evaluating the consequences of decisions, and (D4-Writing), we developed a corresponding analytic dimensional scoring … The students’ performance is evaluated along the four dimensions, which in turn are subdivided into a total of 23 indicators as (sub)categories of CT … For each dimension, we sought detailed evidence in students’ responses for the indicators and scored them on a six-point Likert-type scale. In order to reduce judgment distortions, an elaborate procedure of ‘behaviorally anchored rating scales’ (Smith and Kendall, 1963) was applied by assigning concrete behavioral expectations to certain scale points (Bernardin et al., 1976). To this end, we defined the scale levels by short descriptions of typical behavior and anchored them with concrete examples. … We trained four raters in 1 day using a specially developed training course to evaluate students’ performance along the 23 indicators clustered into four dimensions (for a description of the rater training, see Klotzer, 2018).
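The analytic dimensional scoring described in the quotation can be sketched in a few lines: indicator ratings (on the six-point scale) are grouped by dimension and averaged into a score profile. The dimension labels follow the quoted scheme, but the number of indicators per dimension and all ratings below are invented for illustration; the actual scheme uses 23 indicators.

```python
# Hypothetical indicator ratings (1-6 scale), grouped by scoring dimension
ratings = {
    "D1-Info":     [5, 4, 6, 5],
    "D2-Decision": [3, 4, 3, 4, 2],
    "D3-Conseq":   [4, 3, 3],
    "D4-Writing":  [5, 5, 4],
}

def score_profile(ratings):
    """Mean rating per dimension, rounded to two decimals."""
    return {dim: round(sum(vals) / len(vals), 2) for dim, vals in ratings.items()}

print(score_profile(ratings))
# → {'D1-Info': 5.0, 'D2-Decision': 3.2, 'D3-Conseq': 3.33, 'D4-Writing': 4.67}
```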

Shavelson et al. (2019) examined the interrater agreement of the scoring scheme developed by Zlatkin-Troitschanskaia et al. (2019) and “found that with 23 items and 2 raters the generalizability (“reliability”) coefficient for total scores to be 0.74 (with 4 raters, 0.84)” (Shavelson et al., 2019, p. 15). In the study by Zlatkin-Troitschanskaia et al. (2019, p. 478), three student score profiles (low-, middle-, and high-performers) were identified. Proper interpretation of such profiles requires care. For example, there may be multiple possible explanations for low scores, such as poor CT skills, a lack of disposition to engage with the challenge, or both attributes jointly. These alternative explanations for student performance can potentially pose a threat to the evidentiary argument. In this case, auxiliary information may be available to aid in resolving the ambiguity. For example, student responses to selected- and short constructed-response items in the PA can provide relevant information about the levels of the different skills possessed by the student. When sufficient data are available, the scores can be modeled statistically and/or qualitatively in such a way as to bring them to bear on the technical quality or interpretability of the claims of the assessment: reliability, validity, and utility evidence (Davey et al., 2015; Zlatkin-Troitschanskaia et al., 2019). These kinds of concerns are less critical when PTs are used in classroom settings, where the instructor can draw on other sources of evidence, including direct discussion with the student.
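The reported gain from two raters (0.74) to four raters (0.84) is broadly in line with the Spearman-Brown prophecy from classical test theory. The quick check below is an approximation only: the published figures come from a full generalizability analysis, so exact agreement is not expected.

```python
def spearman_brown(rel_1, k):
    """Projected reliability when averaging scores over k parallel raters."""
    return k * rel_1 / (1 + (k - 1) * rel_1)

def single_rater_rel(rel_k, k):
    """Invert Spearman-Brown: implied reliability of one rater, given k raters."""
    return rel_k / (k - (k - 1) * rel_k)

one = single_rater_rel(0.74, 2)          # implied single-rater reliability, ≈ 0.587
print(round(spearman_brown(one, 4), 2))  # → 0.85, close to the reported 0.84
```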

Use of iPAL Performance Assessments in Educational Practice: Evidence From Preliminary Validation Studies

The assessment framework described here supports the development of a PT in a general setting. Many modifications are possible and, indeed, desirable. If the PT is to be more deeply embedded in a certain discipline (e.g., economics, law, or medicine), for example, then the framework must specify characteristics of the narrative and the complementary documents as to the breadth and depth of disciplinary knowledge that is represented.

To date, preliminary field trials employing the omnibus framework (i.e., a full set of documents) have indicated that 60 min is generally an inadequate amount of time for students to engage with the full set of complementary documents and to craft a complete response to the challenge (for an example, see Shavelson et al., 2019). Accordingly, it would be helpful to develop modified frameworks for PTs that require substantially less time; see, for example, a short performance assessment of civic online reasoning requiring response times from 10 to 50 min (Wineburg et al., 2016). Such assessment frameworks could be derived from the omnibus framework by focusing on a reduced number of facets of CT and specifying the characteristics of the complementary documents to be included – or, perhaps, choices among sets of documents. In principle, one could build a ‘family’ of PTs, each using the same (or nearly the same) storyline and a subset of the full collection of complementary documents.

Paul and Elder (2007) argue that the goal of CT assessments should be to provide faculty with important information about how well their instruction supports the development of students’ CT. In that spirit, the full family of PTs could represent all facets of the construct while affording instructors and students more specific insights into strengths and weaknesses with respect to particular facets of CT. Moreover, the framework should be expanded to include the design of a set of short-answer and/or multiple-choice items to accompany the PT. Ideally, these additional items would be based on the same narrative as the PT to collect more nuanced information on students’ precursor skills, such as reading comprehension, while enhancing the overall reliability of the assessment. Areas where students are under-prepared could be addressed before, or even in parallel with, the development of the focal CT skills. The parallel approach follows the co-requisite model of developmental education. In other settings (e.g., for summative assessment), these complementary items would be administered after the PT to augment the evidence in relation to the various claims. The full PT, taking 90 min or more, could serve as a capstone assessment.

As we transition from simply delivering paper-based assessments by computer to taking full advantage of the affordances of a digital platform, we should learn from the hard-won lessons of the past so that we can make swifter progress with fewer missteps. In that regard, we must take validity as the touchstone – assessment design, development and deployment must all be tightly linked to the operational definition of the CT construct. Considerations of reliability and practicality come into play with various use cases that highlight different purposes for the assessment (for future perspectives, see next section).

The iPAL assessment framework represents a feasible compromise between commercial, standardized assessments of CT (e.g., Liu et al., 2014), on the one hand, and, on the other, freedom for individual faculty to develop assessment tasks according to idiosyncratic models. It imposes a degree of standardization on both task development and scoring, while still allowing some flexibility for faculty to tailor the assessment to meet their unique needs. In so doing, it addresses a key weakness of the AAC&U’s VALUE initiative (see text footnote 2; retrieved 5/7/2020) that has achieved wide acceptance among United States colleges.

The VALUE initiative has produced generic scoring rubrics for 15 domains including CT, problem-solving, and written communication. A rubric for a particular skill domain (e.g., critical thinking) has five to six dimensions with four ordered performance levels for each dimension (1 = lowest, 4 = highest). The performance levels are accompanied by language that is intended to clearly differentiate among levels (see text footnote 3). Faculty are asked to submit student work products from a senior-level course that is intended to yield evidence with respect to student learning outcomes in a particular domain and that, they believe, can elicit performances at the highest level. The collection of work products is then graded by faculty from other institutions who have been trained to apply the rubrics.

A principal difficulty is that there is neither a common framework to guide the design of the challenge, nor any control on task complexity and difficulty. Consequently, there is substantial heterogeneity in the quality and evidential value of the submitted responses. This also causes difficulties with task scoring and inter-rater reliability. Shavelson et al. (2009) discuss some of the problems arising with non-standardized collections of student work.

In this context, one advantage of the iPAL framework is that it can provide valuable guidance and an explicit structure for faculty in developing performance tasks for both instruction and formative assessment. When faculty design assessments, their focus is typically on content coverage rather than other potentially important characteristics, such as the degree of construct representation and the adequacy of their scoring procedures ( Braun, 2019 ).

Concluding Reflections

Challenges to Interpretation and Implementation

Performance tasks such as those generated by iPAL are attractive instruments for assessing CT skills (e.g., Shavelson, 2010; Shavelson et al., 2019). The attraction mainly rests on the assumption that elaborated PTs are more authentic (direct) and more completely capture facets of the target construct (i.e., possess greater construct representation) than the widely used selected-response tests. However, as Messick (1994) noted, authenticity is a “promissory note” that must be redeemed with empirical research. In practice, there are trade-offs among authenticity, construct validity, and psychometric quality such as reliability (Davey et al., 2015).

One reason for Messick’s (1994) caution is that authenticity does not guarantee construct validity. The latter must be established by drawing on multiple sources of evidence (American Educational Research Association et al., 2014). Following ECD principles in designing and developing the PT, as well as the associated scoring rubrics, constitutes an important type of evidence. Further, as Leighton (2019) argues, response process data (“cognitive validity”) are needed to validate claims regarding the cognitive complexity of PTs. Relevant data can be obtained through cognitive laboratory studies involving methods such as think-aloud protocols or eye-tracking. Although time-consuming and expensive, such studies can yield not only evidence of validity but also valuable information to guide refinements of the PT.

Going forward, iPAL PTs must be subjected to validation studies as recommended in the Standards for Educational and Psychological Testing (American Educational Research Association et al., 2014). With a particular focus on the criterion “relationships to other variables,” a framework should include assumptions about the theoretically expected relationships among the indicators assessed by the PT, as well as the indicators’ relationships to external variables such as intelligence or prior (task-relevant) knowledge.

Complementing the necessity of evaluating construct validity, there is the need to consider potential sources of construct-irrelevant variance (CIV). One pertains to student motivation, which is typically greater when the stakes are higher. If students are not motivated, then their performance is likely to be impacted by factors unrelated to their (construct-relevant) ability ( Lane and Stone, 2006 ; Braun et al., 2011 ; Shavelson, 2013 ). Differential motivation across groups can also bias comparisons. Student motivation might be enhanced if the PT is administered in the context of a course with the promise of generating useful feedback on students’ skill profiles.

Construct-irrelevant variance can also occur when students are not equally prepared for the format of the PT or fully appreciate the response requirements. This source of CIV could be alleviated by providing students with practice PT’s. Finally, the use of novel forms of documentation, such as those from the Internet, can potentially introduce CIV due to differential familiarity with forms of representation or contents. Interestingly, this suggests that there may be a conflict between enhancing construct representation and reducing CIV.

Another potential source of CIV is related to response evaluation. Even with training, human raters can vary in accuracy and usage of the full score range. In addition, raters may attend to features of responses that are unrelated to the target construct, such as the length of the students’ responses or the frequency of grammatical errors ( Lane and Stone, 2006 ). Some of these sources of variance could be addressed in an online environment, where word processing software could alert students to potential grammatical and spelling errors before they submit their final work product.

Performance tasks generally take longer to administer and are more costly than traditional assessments, making it more difficult to reliably measure student performance (Messick, 1994; Davey et al., 2015). Indeed, it is well known that more than one performance task is needed to obtain high reliability (Shavelson, 2013). This is due to both student-task interactions and variability in scoring. Sources of student-task interactions are differential familiarity with the topic (Hyytinen and Toom, 2019) and differential motivation to engage with the task. The level of reliability required, however, depends on the context of use. For formative assessment as part of an instructional program, reliability can be lower than for summative purposes, since other types of evidence are generally available to support interpretation and guide pedagogical decisions. Further studies are needed to obtain estimates of reliability in typical instructional settings.

With sufficient data, more sophisticated psychometric analyses become possible. One challenge is that the assumption of unidimensionality required for many psychometric models might be untenable for performance tasks ( Davey et al., 2015 ). Davey et al. (2015) provide the example of a mathematics assessment that requires students to demonstrate not only their mathematics skills but also their written communication skills. Although the iPAL framework does not explicitly address students’ reading comprehension and organization skills, students will likely need to call on these abilities to accomplish the task. Moreover, as the operational definition of CT makes evident, the student must not only deploy several skills in responding to the challenge of the PT, but also carry out component tasks in sequence. The former requirement strongly indicates the need for a multi-dimensional IRT model, while the latter suggests that the usual assumption of local item independence may well be problematic ( Lane and Stone, 2006 ). At the same time, the analytic scoring rubric should facilitate the use of latent class analysis to partition data from large groups into meaningful categories ( Zlatkin-Troitschanskaia et al., 2019 ).

Future Perspectives

Although the iPAL consortium has made substantial progress in the assessment of CT, much remains to be done. Further refinement of existing PTs and their adaptation to different languages and cultures must continue. To this point, there are a number of examples: the refugee crisis PT (cited in Table 1 ) was translated and adapted from Finnish to US English and then to Colombian Spanish; a PT concerning kidney transplants was translated and adapted from German to US English; and two PTs based on ‘legacy admissions’ to US colleges were translated and adapted to Colombian Spanish.

With respect to data collection, there is a need for sufficient data to support psychometric analysis of student responses, especially the relationships among the different components of the scoring rubric, as this would inform both task development and response evaluation (Zlatkin-Troitschanskaia et al., 2019). In addition, more intensive study of response processes through cognitive laboratories and the like is needed to strengthen the evidential argument for construct validity (Leighton, 2019). We are currently conducting empirical studies, collecting data on both iPAL PTs and other measures of CT. These studies will provide evidence of convergent and discriminant validity.

At the same time, efforts should be directed at further development to support the different ways CT PTs might be used (i.e., use cases), especially those that call for formative use of PTs. Incorporating formative assessment into courses can plausibly be expected to improve students’ competency acquisition (Zlatkin-Troitschanskaia et al., 2017). With suitable choices of storylines, appropriate combinations of (modified) PTs, supplemented by short-answer and multiple-choice items, could be interwoven into ordinary classroom activities. The supplementary items may be completely separate from the PTs (as is the case with the CLA+), loosely coupled with the PTs (as in drawing on the same storyline), or tightly linked to the PTs (as in requiring elaboration of certain components of the response to the PT).

As an alternative to such integration, stand-alone modules could be embedded in courses to yield evidence of students’ generic CT skills. Core curriculum or general education courses offer ideal settings for embedding performance assessments. If these assessments were administered to a representative sample of students in each cohort over their years in college, the results would yield important information on the development of CT skills at the population level. As another example, these PAs could be used to assess the competence profiles of students entering Bachelor’s or graduate-level programs as a basis for more targeted instructional support.

Thus, in considering different use cases for the assessment of CT, it is evident that several modifications of the iPAL omnibus assessment framework are needed. As noted earlier, assessments built according to this framework are demanding with respect to both the extensive preliminary work a task requires and the time needed to complete it properly. Accordingly, it would be helpful to have modified versions of the framework, focusing on one or two facets of the CT construct and calling for a smaller number of supplementary documents. The challenge to the student should be reduced commensurately.

Some members of the iPAL collaborative have developed PTs that are embedded in disciplines such as engineering, law, and education (Crump et al., 2019; for teacher education examples, see Jeschke et al., 2019). These are proving to be of great interest to various stakeholders, and further development is likely. Consequently, it is essential that an appropriate assessment framework be established and implemented. Whether a single framework can guide development across different domains is both a conceptual and an empirical question.

Performance Assessment in an Online Learning Environment

Over the last 15 years, increasing amounts of time in both college and work have been spent using computers and other electronic devices. This has led to the formulation of models of the new literacies that attempt to capture key characteristics of these activities. A prominent example is the model proposed by Leu et al. (2020), which frames online reading as a process of problem-based inquiry involving five practices that occur during online research and comprehension:

1. Reading to identify important questions,

2. Reading to locate information,

3. Reading to critically evaluate information,

4. Reading to synthesize online information, and

5. Reading and writing to communicate online information.

The parallels with the iPAL definition of CT are evident and suggest there may be benefits to closer links between these two lines of research. For example, a report by Leu et al. (2014) describes empirical studies comparing assessments of online reading using either open-ended or multiple-choice response formats.

The iPAL consortium has begun to take advantage of the affordances of the online environment (for examples, see Schmidt et al. and Nagel et al. in this special issue). Most obviously, supplementary materials can now include archival photographs, audio recordings, or videos. Additional tasks might include an online search for relevant documents, though this would add considerably to the time demands. Such a search could occur within a simulated Internet environment, as is the case for the IEA’s ePIRLS assessment (Mullis et al., 2017).

The prospect of having access to a wealth of materials that can add to task authenticity is exciting. Yet it can also add ambiguity and information overload. Increased authenticity, then, should be weighed against validity concerns and the time required to absorb the content of these materials. Modifications of the design framework and extensive empirical testing will be required to decide on appropriate trade-offs. A related possibility is to employ some of these materials in short-answer (or even selected-response) items that supplement the main PT. Response formats could include highlighting text or using a drag-and-drop menu to construct a response. Students’ responses could be automatically scored, thereby containing costs. With automated scoring, feedback to students and faculty, including suggestions for next steps in strengthening CT skills, could also be provided without adding to faculty workload. Taking advantage of the online environment to incorporate new types of supplementary documents, and perhaps to introduce new response formats as well, should therefore be a high priority. Finally, further investigation of the overlap between this formulation of CT and the characterization of online reading promulgated by Leu et al. (2020) is a promising direction to pursue.

Data Availability Statement

All datasets generated for this study are included in the article/supplementary material.

Author Contributions

HB wrote the article. RS, OZ-T, and KB were involved in the preparation and revision of the article and co-wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was funded in part by the Spencer Foundation (Grant No. 201700123).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank all the researchers who have participated in the iPAL program.

  • ^ https://www.ipal-rd.com/
  • ^ https://www.aacu.org/value
  • ^ When test results are reported by means of substantively defined categories, the scoring is termed “criterion-referenced”. This is in contrast to results reported as percentiles; such scoring is termed “norm-referenced”.

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association.

Arum, R., and Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press.

Association of American Colleges and Universities (n.d.). VALUE: What is Value? Available online at: https://www.aacu.org/value (accessed May 7, 2020).

Association of American Colleges and Universities [AACU] (2018). Fulfilling the American Dream: Liberal Education and the Future of Work. Available online at: https://www.aacu.org/research/2018-future-of-work (accessed May 1, 2020).

Braun, H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89, 429–440. doi: 10.1111/bjep.12274

Braun, H. I., Kirsch, I., and Yamamoto, K. (2011). An experimental study of the effects of monetary incentives on performance on the 12th grade NAEP reading assessment. Teach. Coll. Rec. 113, 2309–2344.

Crump, N., Sepulveda, C., Fajardo, A., and Aguilera, A. (2019). Systematization of performance tests in critical thinking: an interdisciplinary construction experience. Rev. Estud. Educ. 2, 17–47.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., and Wise, L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.

Erwin, T. D., and Sebrell, K. W. (2003). Assessment of critical thinking: ETS’s tasks in critical thinking. J. Gen. Educ. 52, 50–70. doi: 10.1353/jge.2003.0019

Haertel, G. D., and Fujii, R. (2017). “Evidence-centered design and postsecondary assessment,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 313–339. doi: 10.4324/9781315709307-26

Hyytinen, H., and Toom, A. (2019). Developing a performance assessment task in the Finnish higher education context: conceptual and empirical insights. Br. J. Educ. Psychol. 89, 551–563. doi: 10.1111/bjep.12283

Hyytinen, H., Toom, A., and Shavelson, R. J. (2019). “Enhancing scientific thinking through the development of critical thinking in higher education,” in Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills , eds M. Murtonen and K. Balloo (London: Palgrave MacMillan).

Indiana University (2019). FSSE 2019 Frequencies: FSSE 2019 Aggregate. Available online at: http://fsse.indiana.edu/pdf/FSSE_IR_2019/summary_tables/FSSE19_Frequencies_(FSSE_2019).pdf (accessed May 1, 2020).

Jeschke, C., Kuhn, C., Lindmeier, A., Zlatkin-Troitschanskaia, O., Saas, H., and Heinze, A. (2019). Performance assessment to investigate the domain specificity of instructional skills among pre-service and in-service teachers of mathematics and economics. Br. J. Educ. Psychol. 89, 538–550. doi: 10.1111/bjep.12277

Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Cambridge, MA: Harvard University Press.

Klein, S., Benjamin, R., Shavelson, R., and Bolus, R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31, 415–439. doi: 10.1177/0193841x07303318

Kosslyn, S. M., and Nelson, B. (2017). Building the Intentional University: Minerva and the Future of Higher Education. Cambridge, MAL: The MIT Press.

Lane, S., and Stone, C. A. (2006). “Performance assessment,” in Educational Measurement , 4th Edn, ed. R. L. Brennan (Lanham, MA: Rowman & Littlefield Publishers), 387–432.

Leighton, J. P. (2019). The risk–return trade-off: performance assessments and cognitive validation of inferences. Br. J. Educ. Psychol. 89, 441–455. doi: 10.1111/bjep.12271

Leu, D. J., Kiili, C., Forzani, E., Zawilinski, L., McVerry, J. G., and O’Byrne, W. I. (2020). “The new literacies of online research and comprehension,” in The Concise Encyclopedia of Applied Linguistics , ed. C. A. Chapelle (Oxford: Wiley-Blackwell), 844–852.

Leu, D. J., Kulikowich, J. M., Kennedy, C., and Maykel, C. (2014). “The ORCA Project: designing technology-based assessments for online research,” in Paper Presented at the American Educational Research Annual Meeting , Philadelphia, PA.

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. Ser. 1, 1–23. doi: 10.1002/ets2.12009

McClelland, D. C. (1973). Testing for competence rather than for “intelligence”. Am. Psychol. 28, 1–14. doi: 10.1037/h0034092

McGrew, S., Ortega, T., Breakstone, J., and Wineburg, S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 4, 4–9, 39.

Mejía, A., Mariño, J. P., and Molina, A. (2019). Incorporating perspective analysis into critical thinking performance assessments. Br. J. Educ. Psychol. 89, 456–467. doi: 10.1111/bjep.12297

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23, 13–23. doi: 10.3102/0013189x023002013

Mislevy, R. J., Almond, R. G., and Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Res. Rep. Ser. 2003, i–29. doi: 10.1002/j.2333-8504.2003.tb01908.x

Mislevy, R. J., and Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educ. Meas. Issues Pract. 25, 6–20. doi: 10.1111/j.1745-3992.2006.00075.x

Mullis, I. V. S., Martin, M. O., Foy, P., and Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Available online at:: http://timssandpirls.bc.edu/pirls2016/international-results/ (accessed May 1, 2020).

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., and Beck, K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education , eds O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper, and C. Lautenbach (Berlin: Springer), 281–299. doi: 10.1007/978-3-658-27886-1_14

Organisation for Economic Co-operation and Development [OECD] (2012). AHELO: Feasibility Study Report , Vol. 1. Paris: OECD. Design and implementation.

Organisation for Economic Co-operation and Development [OECD] (2013). AHELO: Feasibility Study Report , Vol. 2. Paris: OECD. Data analysis and national experiences.

Oser, F. K., and Biedermann, H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO) , ed. O. Zlatkin-Troitschanskaia (Cham: Springer), 89–106. doi: 10.1007/978-3-030-26578-6_7

Paul, R., and Elder, L. (2007). Consequential validity: using assessment to drive instruction. Found. Crit. Think. 29, 31–40.

Pellegrino, J. W., and Hilton, M. L. (eds) (2012). Education for life and work: Developing Transferable Knowledge and Skills in the 21st Century. Washington DC: National Academies Press.

Shavelson, R. (2010). Measuring College Learning Responsibly: Accountability in a New Era. Redwood City, CA: Stanford University Press.

Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educ. Psychol. 48, 73–86. doi: 10.1080/00461520.2013.779483

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., and Marino, J. P. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19, 337–362. doi: 10.1080/15305058.2018.1543309

Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Marino, J. P. (2018). “International performance assessment of learning in higher education (iPAL): research and development,” in Assessment of Learning Outcomes in Higher Education: Cross-National Comparisons and Perspectives , eds O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, and C. Kuhn (Berlin: Springer), 193–214. doi: 10.1007/978-3-319-74338-7_10

Shavelson, R. J., Klein, S., and Benjamin, R. (2009). The limitations of portfolios. Inside Higher Educ. Available online at: https://www.insidehighered.com/views/2009/10/16/limitations-portfolios

Stolzenberg, E. B., Eagan, M. K., Zimmerman, H. B., Berdan Lozano, J., Cesar-Davis, N. M., Aragon, M. C., et al. (2019). Undergraduate Teaching Faculty: The HERI Faculty Survey 2016–2017. Los Angeles, CA: UCLA.

Tessier-Lavigne, M. (2020). Putting Ethics at the Heart of Innovation. Stanford, CA: Stanford Magazine.

Wheeler, P., and Haertel, G. D. (1993). Resource Handbook on Performance Assessment and Measurement: A Tool for Students, Practitioners, and Policymakers. Palm Coast, FL: Owl Press.

Wineburg, S., McGrew, S., Breakstone, J., and Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Executive Summary. Stanford, CA: Stanford History Education Group.

Zahner, D. (2013). Reliability and Validity–CLA+. Council for Aid to Education. Available online at:: https://pdfs.semanticscholar.org/91ae/8edfac44bce3bed37d8c9091da01d6db3776.pdf .

Zlatkin-Troitschanskaia, O., and Shavelson, R. J. (2019). Performance assessment of student learning in higher education [Special issue]. Br. J. Educ. Psychol. 89, i–iv, 413–563.

Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., and Brückner, S. (2017). Modeling and Measuring Competencies in Higher Education: Approaches to Challenges in Higher Education Policy and Practice. Berlin: Springer VS.

Zlatkin-Troitschanskaia, O., Pant, H. A., Toepper, M., and Lautenbach, C. (eds) (2020). Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results. Wiesbaden: Springer.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., and Pant, H. A. (2018). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 686–697.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., and Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89, 468–484. doi: 10.1111/bjep.12286

Keywords: critical thinking, performance assessment, assessment framework, scoring rubric, evidence-centered design, 21st century skills, higher education

Citation: Braun HI, Shavelson RJ, Zlatkin-Troitschanskaia O and Borowiec K (2020) Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Received: 30 May 2020; Accepted: 04 August 2020; Published: 08 September 2020.


Copyright © 2020 Braun, Shavelson, Zlatkin-Troitschanskaia and Borowiec. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY).

*Correspondence: Henry I. Braun, [email protected]

This article is part of the Research Topic "Assessing Information Processing and Online Reasoning as a Prerequisite for Learning in Higher Education."

Argumentation, Evidence Evaluation and Critical Thinking

María Pilar Jiménez-Aleixandre & Blanca Puig

Part of the book series: Springer International Handbooks of Education (SIHE, volume 24). First Online: 23 November 2011, pp. 1001–1015.

This chapter addresses the relationships between argumentation and critical thinking. The underlying questions are how argumentation supports the capacity to discriminate between claims justified by evidence and mere opinion, and how argumentation can contribute to two types of objectives related to learning science and to citizenship. First, various meanings for critical thinking in different communities are reviewed. Then, we propose our characterisation of critical thinking, which assumes that evidence evaluation is an essential component, but that there are other components related to the capacities of reflecting on the world around us and of participating in it (e.g. developing an independent opinion, including challenging the ideas of one’s own community). This characterisation draws both from the notion of commitment to evidence and from critical theorists. Using this frame, the chapter examines the contributions of argumentation in science education to the components of critical thinking, and also discusses the evaluation of evidence and the different factors influencing or even hampering it. The chapter concludes with consideration of the development of critical thinking in the science classroom.



Acknowledgements

This work was supported by the Spanish Ministerio de Educación y Ciencia (MEC), partly funded by the European Regional Development Fund (ERDF), code SEJ2006-15589-C02-01/EDUC. The authors are grateful to Glen Aikenhead for his valuable feedback on the first draft.

Author information: María Pilar Jiménez-Aleixandre & Blanca Puig, Didactica das Ciencias Experimentais, University of Santiago de Compostela, Santiago de Compostela, Spain. Corresponding author: María Pilar Jiménez-Aleixandre.

Editors: Barry J. Fraser (Science & Mathematics Education Centre, Curtin University of Technology, Perth, Australia), Kenneth Tobin (The Graduate Centre, City University of New York, USA), and Campbell J. McRobbie (Ctr. Mathematics & Science Education, Queensland University of Technology, Australia).

© 2012 Springer Science+Business Media B.V.

About this chapter:

Jiménez-Aleixandre, M. P., & Puig, B. (2012). Argumentation, evidence evaluation and critical thinking. In B. Fraser, K. Tobin, & C. McRobbie (Eds.), Second International Handbook of Science Education (Springer International Handbooks of Education, vol. 24). Springer, Dordrecht. https://doi.org/10.1007/978-1-4020-9041-7_66

Published: 23 November 2011. Print ISBN: 978-1-4020-9040-0. Online ISBN: 978-1-4020-9041-7.


Critical Thinking: Where to Begin


If you are new to critical thinking or wish to deepen your conception of it, we recommend you review the content below and bookmark this page for future reference.

Our Conception of Critical Thinking...


"Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. In its exemplary form, it is based on universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness..."

"Critical thinking is self-guided, self-disciplined thinking which attempts to reason at the highest level of quality in a fairminded way. People who think critically attempt, with consistent and conscious effort, to live rationally, reasonably, and empathically. They are keenly aware of the inherently flawed nature of human thinking when left unchecked. They strive to diminish the power of their egocentric and sociocentric tendencies. They use the intellectual tools that critical thinking offers – concepts and principles that enable them to analyze, assess, and improve thinking. They work diligently to develop the intellectual virtues of intellectual integrity, intellectual humility, intellectual civility, intellectual empathy, intellectual sense of justice and confidence in reason. They realize that no matter how skilled they are as thinkers, they can always improve their reasoning abilities and they will at times fall prey to mistakes in reasoning, human irrationality, prejudices, biases, distortions, uncritically accepted social rules and taboos, self-interest, and vested interest.

They strive to improve the world in whatever ways they can and contribute to a more rational, civilized society. At the same time, they recognize the complexities often inherent in doing so. They strive never to think simplistically about complicated issues and always to consider the rights and needs of relevant others. They recognize the complexities in developing as thinkers, and commit themselves to lifelong practice toward self-improvement. They embody the Socratic principle that "the unexamined life is not worth living," because they realize that many unexamined lives together result in an uncritical, unjust, dangerous world."

Why Critical Thinking?


The Problem:

Everyone thinks; it is our nature to do so. But much of our thinking, left to itself, is biased, distorted, partial, uninformed, or downright prejudiced. Yet the quality of our lives, and of what we produce, make, or build, depends precisely on the quality of our thought. Shoddy thinking is costly, both in money and in quality of life. Excellence in thought, however, must be systematically cultivated.

A Brief Definition:

Critical thinking is the art of analyzing and evaluating thinking with a view to improving it.

The Result:

  A well-cultivated critical thinker:

  • raises vital questions and problems, formulating them clearly and precisely;
  • gathers and assesses relevant information, using abstract ideas to interpret it effectively;
  • comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards;
  • thinks openmindedly within alternative systems of thought, recognizing and assessing, as need be, their assumptions, implications, and practical consequences; and
  • communicates effectively with others in figuring out solutions to complex problems.

Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking. It requires rigorous standards of excellence and mindful command of their use. It entails effective communication and problem-solving abilities, and a commitment to overcoming our native egocentrism and sociocentrism. Read more about our concept of critical thinking.

The Essential Dimensions of Critical Thinking


Our conception of critical thinking is based on the substantive approach developed by Dr. Richard Paul and his colleagues at the Center and Foundation for Critical Thinking over multiple decades. It is relevant to every subject, discipline, and profession, and to reasoning through the problems of everyday life. It entails five essential dimensions of critical thinking.

In sum, the elements or structures of thought enable us to "take our thinking apart" and analyze it. The intellectual standards are used to assess and evaluate the elements. The intellectual traits are dispositions of mind embodied by the fairminded critical thinker. To cultivate the mind, we need command of these essential dimensions, and we need to consistently apply them as we think through the many problems and issues in our lives.

The Elements of Reasoning and Intellectual Standards


To learn more about the elements of thought and how to apply the intellectual standards, explore the interactive model on our website.

Why the Analysis of Thinking Is Important

If you want to think well, you must understand at least the rudiments of thought, the most basic structures out of which all thinking is made. You must learn how to take thinking apart.

Analyzing the Logic of a Subject

When we understand the elements of reasoning, we realize that all subjects, all disciplines, have a fundamental logic defined by the structures of thought embedded within them. Therefore, to lay bare a subject’s most fundamental logic, we should begin with these questions:


Going Deeper...


The Critical Thinking Bookstore  

Our online bookstore houses numerous books and teacher's manuals, Thinker's Guides, videos, and other educational materials.

Learn From Our Fellows and Scholars

Watch our Event Calendar, which provides an overview of all upcoming conferences and academies hosted by the Foundation for Critical Thinking. Clicking an entry on the Event Calendar will bring up that event's details, and the option to register. For those interested in online learning, the Foundation offers accredited online courses in critical thinking for both educators and the general public, as well as an online test for evaluating basic comprehension of critical thinking concepts. We are in the process of developing more online learning tools and tests to offer the community.

Utilizing this Website

This website contains a large amount of research and an online library of articles, both of which are freely available to the public. We also invite you to become a member of the Critical Thinking Community, where you will gain access to more tools and materials. If you cannot locate a resource on a specific topic or concept, try searching for it using our Search Tool, located at the upper-right of every page on the website.

Critical Thinking Definition, Skills, and Examples



Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings.

Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful details to solve problems or make decisions. Employers prioritize the ability to think critically—find out why, plus see how you can demonstrate that you have this ability throughout the job application process. 

Why Do Employers Value Critical Thinking Skills?

Employers want job candidates who can evaluate a situation using logical thought and offer the best solution.

 Someone with critical thinking skills can be trusted to make decisions independently, and will not need constant handholding.

Hiring a critical thinker means that micromanaging won't be required. Critical thinking abilities are among the most sought-after skills in almost every industry and workplace. You can demonstrate critical thinking by using related keywords in your resume and cover letter, and during your interview.

Examples of Critical Thinking

The circumstances that demand critical thinking vary from industry to industry. Some examples include:

  • A triage nurse analyzes the cases at hand and decides the order by which the patients should be treated.
  • A plumber evaluates the materials that would best suit a particular job.
  • An attorney reviews evidence and devises a strategy to win a case or to decide whether to settle out of court.
  • A manager analyzes customer feedback forms and uses this information to develop a customer service training session for employees.

Promote Your Skills in Your Job Search

If critical thinking is a key phrase in the job listings you are applying for, be sure to emphasize your critical thinking skills throughout your job search.

Add Keywords to Your Resume

You can use critical thinking keywords (analytical, problem solving, creativity, etc.) in your resume. When describing your work history, include top critical thinking skills that accurately describe you. You can also include them in your resume summary, if you have one.

For example, your summary might read, “Marketing Associate with five years of experience in project management. Skilled in conducting thorough market research and competitor analysis to assess market trends and client needs, and to develop appropriate acquisition tactics.”

Mention Skills in Your Cover Letter

Include these critical thinking skills in your cover letter. In the body of your letter, mention one or two of these skills, and give specific examples of times when you have demonstrated them at work. Think about times when you had to analyze or evaluate materials to solve a problem.

Show the Interviewer Your Skills

You can use these skill words in an interview. Discuss a time when you were faced with a particular problem or challenge at work and explain how you applied critical thinking to solve it.

Some interviewers will give you a hypothetical scenario or problem, and ask you to use critical thinking skills to solve it. In this case, explain your thought process thoroughly to the interviewer. They are typically more focused on how you arrive at your solution than on the solution itself. The interviewer wants to see you analyze and evaluate (key parts of critical thinking) the given scenario or problem.

Of course, each job will require different skills and experiences, so make sure you read the job description carefully and focus on the skills listed by the employer.

Top Critical Thinking Skills

Keep these in-demand critical thinking skills in mind as you update your resume and write your cover letter. As you've seen, you can also emphasize them at other points throughout the application process, such as your interview. 

Analysis

Part of critical thinking is the ability to carefully examine something, whether it is a problem, a set of data, or a text. People with analytical skills can examine information, understand what it means, and properly explain to others the implications of that information.

  • Asking Thoughtful Questions
  • Data Analysis
  • Interpretation
  • Questioning Evidence
  • Recognizing Patterns

Communication

Often, you will need to share your conclusions with your employers or with a group of colleagues. You need to be able to  communicate with others  to share your ideas effectively. You might also need to engage in critical thinking in a group. In this case, you will need to work with others and communicate effectively to figure out solutions to complex problems.

  • Active Listening
  • Collaboration
  • Explanation
  • Interpersonal
  • Presentation
  • Verbal Communication
  • Written Communication

Creativity

Critical thinking often involves creativity and innovation. You might need to spot patterns in the information you are looking at or come up with a solution that no one else has thought of before. All of this involves a creative eye that can take a different approach from the ones already tried.

  • Flexibility
  • Conceptualization
  • Imagination
  • Drawing Connections
  • Synthesizing

Open-Mindedness

To think critically, you need to be able to put aside any assumptions or judgments and merely analyze the information you receive. You need to be objective, evaluating ideas without bias.

  • Objectivity
  • Observation

Problem Solving

Problem-solving is another critical thinking skill that involves analyzing a problem, generating and implementing a solution, and assessing the success of the plan. Employers don't simply want employees who can think about information critically; they also need employees who can come up with practical solutions.

  • Attention to Detail
  • Clarification
  • Decision Making
  • Groundedness
  • Identifying Patterns

More Critical Thinking Skills

  • Inductive Reasoning
  • Deductive Reasoning
  • Noticing Outliers
  • Adaptability
  • Emotional Intelligence
  • Brainstorming
  • Optimization
  • Restructuring
  • Integration
  • Strategic Planning
  • Project Management
  • Ongoing Improvement
  • Causal Relationships
  • Case Analysis
  • Diagnostics
  • SWOT Analysis
  • Business Intelligence
  • Quantitative Data Management
  • Qualitative Data Management
  • Risk Management
  • Scientific Method
  • Consumer Behavior

Key Takeaways

  • Demonstrate that you have critical thinking skills by adding relevant keywords to your resume.
  • Mention pertinent critical thinking skills in your cover letter, too, and include an example of a time when you demonstrated them at work.
  • Finally, highlight critical thinking skills during your interview. For instance, you might discuss a time when you were faced with a challenge at work and explain how you applied critical thinking skills to solve it.



Evaluative Thinking

Evaluative thinking is a disciplined approach to inquiry and reflective practice that helps us make sound judgements using good evidence, as a matter of habit.


A form of critical thinking

Evaluation is a form of critical thinking that involves examining evidence to make a judgement.

Evaluative claims have two parts: a conclusion and an explanation.

For example:

  • xyz was great, because…
  • xyz is disappointing, because…
  • xyz is a good way to go in this situation, because…

Drawing conclusions based on intuition is not evaluation. Neither is personal opinion, speculation or conjecture.

Each of us makes evaluative judgements every day. Sometimes these are quick assessments that don't matter much, like what to order for lunch. At other times we need to slow down our thought processes, weighing up all the factors carefully and making our deliberation transparent to others.

A disciplined approach

Evaluating a strategic direction or project in a school draws on similar thinking processes and mental disciplines as assessing student performance or recruiting a new staff member.

When we engage in evaluative thinking, we seek to:

  • suspend judgement, considering alternative explanations and allowing new evidence to change our mind
  • question assumptions, particularly about the pathway of cause and effect
  • select and develop solutions that are informed by a strong evidence base and are responsive to our context and priorities
  • value the lessons we can learn from all our experiences, disappointments as well as triumphs
  • wrestle with questions of impact and effectiveness, not just activity and implementation
  • maximise the value of existing data sources already available to us, mindful of their limitations
  • work to improve the strength of our evidence base as we go.

Cognitive bias

Evaluative thinking helps us navigate the cognitive biases that cloud our judgement.

Cognitive bias occurs when our analysis of a situation is compromised by 'mental shortcuts' or patterns of thinking that place undue emphasis on a particular perspective.

Confirmation bias is one type of cognitive bias that can easily compromise an evaluation. This is where the evaluator is already leaning towards a particular conclusion before they see the data. Without realising it, they then pay more attention to data that supports this position.
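As a toy sketch, with entirely made-up numbers, confirmation bias can be pictured as filtering the evidence before weighing it: an evaluator who attends only to supportive data reaches a very different judgement than one who weighs everything.

```python
# Hypothetical evidence scores for a program under evaluation:
# positive values support it, negative values count against it.
evidence = [+2, -3, +1, -2, +1, -4]

# A biased evaluator notices only the data that confirms the hoped-for conclusion.
biased = [e for e in evidence if e > 0]

# A disciplined evaluator weighs all of the evidence.
balanced = evidence

print(sum(biased))    # -> 4  (the program looks like a success)
print(sum(balanced))  # -> -5 (overall, the evidence is negative)
```

The point of the sketch is not the arithmetic but the selection step: the bias enters before any judgement is made, in which data gets attended to at all.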

Although we may not be able to free ourselves from our cognitive biases, being aware of them is a good first step. The mental disciplines of evaluative thinking can help manage these biases and keep our reasoning sharp and convincing.

Read more about cognitive bias.

Develop evaluative thinking

Working openly with colleagues helps to develop evaluative thinking in ourselves and others. Evaluative thinking sometimes comes naturally, but at other times it can feel challenging, even threatening. If we want to develop evaluative thinking in others, we first need to model it ourselves.

A good way to strengthen evaluative practice in schools is to engage in evaluative thinking as a group: deliberately, transparently and in a supportive context. In this way people have the time and space to reflect on their thinking. This is particularly important if we are to identify or 'unlearn' bad habits that we may have fallen into.

For example, the simple act of being asked 'What makes you think that?' prompts us to explain how we formed our judgements, including the evidence we have considered as part of this.

The importance of modelling and collaborative practice in evaluation is highlighted in the Australian Institute for Teaching and School Leadership's (AITSL ) profile relating to leading improvement, innovation and change . This profile encourages school leaders to develop 'a culture of continuous improvement' and 'a culture of trust and collaboration, where change and innovation based on research and evidence can flourish'.

As part of doing this, the Leadership Profile highlights the value of 'evaluating outcomes and refining actions as change is implemented, taking account of the impact of change on others, providing opportunities for regular feedback'.

Keep reading

  • Disciplines of evaluative thinking
  • Professional learning
  • Teaching and learning
  • Building capacity

Business Unit:

  • Centre for Education Statistics and Evaluation

Logo for OPEN OKSTATE

1 Introduction to Critical Thinking

I. What Is Critical Thinking? [1]

Critical thinking is the ability to think clearly and rationally about what to do or what to believe.  It includes the ability to engage in reflective and independent thinking. Someone with critical thinking skills is able to do the following:

  • Understand the logical connections between ideas.
  • Identify, construct, and evaluate arguments.
  • Detect inconsistencies and common mistakes in reasoning.
  • Solve problems systematically.
  • Identify the relevance and importance of ideas.
  • Reflect on the justification of one’s own beliefs and values.

Critical thinking is not simply a matter of accumulating information. A person with a good memory and who knows a lot of facts is not necessarily good at critical thinking. Critical thinkers are able to deduce consequences from what they know, make use of information to solve problems, and to seek relevant sources of information to inform themselves.

Critical thinking should not be confused with being argumentative or being critical of other people. Although critical thinking skills can be used in exposing fallacies and bad reasoning, critical thinking can also play an important role in cooperative reasoning and constructive tasks. Critical thinking can help us acquire knowledge, improve our theories, and strengthen arguments. We can also use critical thinking to enhance work processes and improve social institutions.

Some people believe that critical thinking hinders creativity because critical thinking requires following the rules of logic and rationality, whereas creativity might require breaking those rules. This is a misconception. Critical thinking is quite compatible with thinking “out-of-the-box,” challenging consensus views, and pursuing less popular approaches. If anything, critical thinking is an essential part of creativity because we need critical thinking to evaluate and improve our creative ideas.

II. The Importance of Critical Thinking

Critical thinking is a domain-general thinking skill. The ability to think clearly and rationally is important whatever we choose to do. If you work in education, research, finance, management or the legal profession, then critical thinking is obviously important. But critical thinking skills are not restricted to a particular subject area. Being able to think well and solve problems systematically is an asset for any career.

Critical thinking is very important in the new knowledge economy.  The global knowledge economy is driven by information and technology. One has to be able to deal with changes quickly and effectively. The new economy places increasing demands on flexible intellectual skills, and the ability to analyze information and integrate diverse sources of knowledge in solving problems. Good critical thinking promotes such thinking skills, and is very important in the fast-changing workplace.

Critical thinking enhances language and presentation skills. Thinking clearly and systematically can improve the way we express our ideas. In learning how to analyze the logical structure of texts, critical thinking also improves comprehension abilities.

Critical thinking promotes creativity. To come up with a creative solution to a problem involves not just having new ideas. It must also be the case that the new ideas being generated are useful and relevant to the task at hand. Critical thinking plays a crucial role in evaluating new ideas, selecting the best ones and modifying them if necessary.

Critical thinking is crucial for self-reflection. In order to live a meaningful life and to structure our lives accordingly, we need to justify and reflect on our values and decisions. Critical thinking provides the tools for this process of self-evaluation.

Good critical thinking is the foundation of science and democracy. Science requires the critical use of reason in experimentation and theory confirmation. The proper functioning of a liberal democracy requires citizens who can think critically about social issues to inform their judgments about proper governance and to overcome biases and prejudice.

Critical thinking is a metacognitive skill. What this means is that it is a higher-level cognitive skill that involves thinking about thinking. We have to be aware of the good principles of reasoning, and be reflective about our own reasoning. In addition, we often need to make a conscious effort to improve ourselves, avoid biases, and maintain objectivity. This is notoriously hard to do. We are all able to think, but to think well often requires a long period of training. The mastery of critical thinking is similar to the mastery of many other skills. There are three important components: theory, practice, and attitude.

III. Improving Our Thinking Skills

If we want to think correctly, we need to follow the correct rules of reasoning. Knowledge of theory includes knowledge of these rules. These are the basic principles of critical thinking, such as the laws of logic, and the methods of scientific reasoning, etc.

Also, it would be useful to know something about what not to do if we want to reason correctly. This means we should have some basic knowledge of the mistakes that people make. First, this requires some knowledge of typical fallacies. Second, psychologists have discovered persistent biases and limitations in human reasoning. An awareness of these empirical findings will alert us to potential problems.

However, merely knowing the principles that distinguish good and bad reasoning is not enough. We might study in the classroom about how to swim, and learn about the basic theory, such as the fact that one should not breathe underwater. But unless we can apply such theoretical knowledge through constant practice, we might not actually be able to swim.

Similarly, to be good at critical thinking skills it is necessary to internalize the theoretical principles so that we can actually apply them in daily life. There are at least two ways to do this. One is to perform lots of quality exercises. These exercises don’t just include practicing in the classroom or receiving tutorials; they also include engaging in discussions and debates with other people in our daily lives, where the principles of critical thinking can be applied. The second method is to think more deeply about the principles that we have acquired. In the human mind, memory and understanding are acquired through making connections between ideas.

Good critical thinking skills require more than just knowledge and practice. Persistent practice can bring about improvements only if one has the right kind of motivation and attitude. The following attitudes are not uncommon, but they are obstacles to critical thinking:

  • I prefer being given the correct answers rather than figuring them out myself.
  • I don’t like to think a lot about my decisions as I rely only on gut feelings.
  • I don’t usually review the mistakes I have made.
  • I don’t like to be criticized.

To improve our thinking we have to recognize the importance of reflecting on the reasons for belief and action. We should also be willing to engage in debate, break old habits, and deal with linguistic complexities and abstract concepts.

The  California Critical Thinking Disposition Inventory  is a psychological test that is used to measure whether people are disposed to think critically. It measures the seven different thinking habits listed below, and it is useful to ask ourselves to what extent they describe the way we think:

  • Truth-Seeking—Do you try to understand how things really are? Are you interested in finding out the truth?
  • Open-Mindedness—How receptive are you to new ideas, even when you do not intuitively agree with them? Do you give new concepts a fair hearing?
  • Analyticity—Do you try to understand the reasons behind things? Do you act impulsively or do you evaluate the pros and cons of your decisions?
  • Systematicity—Are you systematic in your thinking? Do you break down a complex problem into parts?
  • Confidence in Reasoning—Do you always defer to other people? How confident are you in your own judgment? Do you have reasons for your confidence? Do you have a way to evaluate your own thinking?
  • Inquisitiveness—Are you curious about unfamiliar topics and resolving complicated problems? Will you chase down an answer until you find it?
  • Maturity of Judgment—Do you jump to conclusions? Do you try to see things from different perspectives? Do you take other people’s experiences into account?

Finally, as mentioned earlier, psychologists have discovered over the years that human reasoning can be easily affected by a variety of cognitive biases. For example, people tend to be over-confident of their abilities and focus too much on evidence that supports their pre-existing opinions. We should be alert to these biases in our attitudes towards our own thinking.

IV. Defining Critical Thinking

There are many different definitions of critical thinking. Below we look at some well-known ones, in chronological order; you might notice that they all emphasize the importance of clarity and rationality.

1) Many people trace the importance of critical thinking in education to the early twentieth-century American philosopher John Dewey. But Dewey did not make very extensive use of the term “critical thinking.” Instead, in his book  How We Think (1910), he argued for the importance of what he called “reflective thinking”:

…[when] the ground or basis for a belief is deliberately sought and its adequacy to support the belief examined. This process is called reflective thought; it alone is truly educative in value…

Active, persistent and careful consideration of any belief or supposed form of knowledge in light of the grounds that support it, and the further conclusions to which it tends, constitutes reflective thought.

There is however one passage from How We Think where Dewey explicitly uses the term “critical thinking”:

The essence of critical thinking is suspended judgment; and the essence of this suspense is inquiry to determine the nature of the problem before proceeding to attempts at its solution. This, more than any other thing, transforms mere inference into tested inference, suggested conclusions into proof.

2) The  Watson-Glaser Critical Thinking Appraisal  (1980) is a well-known psychological test of critical thinking ability. The authors of this test define critical thinking as:

…a composite of attitudes, knowledge and skills. This composite includes: (1) attitudes of inquiry that involve an ability to recognize the existence of problems and an acceptance of the general need for evidence in support of what is asserted to be true; (2) knowledge of the nature of valid inferences, abstractions, and generalizations in which the weight or accuracy of different kinds of evidence are logically determined; and (3) skills in employing and applying the above attitudes and knowledge.

3) A very well-known and influential definition of critical thinking comes from philosopher and professor Robert Ennis in his work “A Taxonomy of Critical Thinking Dispositions and Abilities” (1987):

Critical thinking is reasonable reflective thinking that is focused on deciding what to believe or do.

4) The following definition comes from a statement written in 1987 by the philosophers Michael Scriven and Richard Paul for the  National Council for Excellence in Critical Thinking (link), an organization promoting critical thinking in the US:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. In its exemplary form, it is based on universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness. It entails the examination of those structures or elements of thought implicit in all reasoning: purpose, problem, or question-at-issue, assumptions, concepts, empirical grounding; reasoning leading to conclusions, implications and consequences, objections from alternative viewpoints, and frame of reference.

5) The following excerpt from Peter A. Facione’s “Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction” (1990) is quoted from a report written for the American Philosophical Association:

We understand critical thinking to be purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based. CT is essential as a tool of inquiry. As such, CT is a liberating force in education and a powerful resource in one’s personal and civic life. While not synonymous with good thinking, CT is a pervasive and self-rectifying human phenomenon. The ideal critical thinker is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fairminded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit. Thus, educating good critical thinkers means working toward this ideal. It combines developing CT skills with nurturing those dispositions which consistently yield useful insights and which are the basis of a rational and democratic society.

V. Two Features of Critical Thinking

A. How, Not What

Critical thinking is concerned not with what you believe, but rather how or why you believe it. Most classes, such as those on biology or chemistry, teach you what to believe about a subject matter. In contrast, critical thinking is not particularly interested in what the world is, in fact, like. Rather, critical thinking will teach you how to form beliefs and how to think. It is interested in the type of reasoning you use when you form your beliefs, and concerns itself with whether you have good reasons to believe what you believe. Therefore, this class isn’t a class on the psychology of reasoning, which brings us to the second important feature of critical thinking.

B. Ought, Not Is (or Normative, Not Descriptive)

There is a difference between normative and descriptive theories. Descriptive theories, such as those provided by physics, provide a picture of how the world factually behaves and operates. In contrast, normative theories, such as those provided by ethics or political philosophy, provide a picture of how the world should be. Rather than asking why something is the way it is, normative theories ask how it should be. In this course, we will be interested in the normative theories that govern our thinking and reasoning. Therefore, we will not be interested in how we actually reason, but rather focus on how we ought to reason.

In the introduction to this course we considered a selection task with cards that must be flipped in order to check the validity of a rule. We noted that many people fail to identify all the cards required to check the rule. This is how people do in fact reason (descriptive). We then noted that you must flip over two cards. This is how people ought to reason (normative).
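The selection task can be made concrete with a short sketch. The card faces and rule below are the classic four-card version of the task (an assumption, since the introduction referenced above isn't reproduced here): a card needs to be flipped exactly when its hidden side could falsify the rule.

```python
# Classic Wason selection task (assumed illustration, not from the text):
# each card has a letter on one side and a number on the other.
# Rule under test: "If a card has a D on one side, it has a 3 on the other."
# A card must be flipped iff its hidden side could falsify the rule.

def cards_to_flip(visible_faces):
    """Return the visible faces whose hidden side could break the rule."""
    flips = []
    for face in visible_faces:
        if face == "D":
            # The hidden number might not be 3 -> could falsify the rule.
            flips.append(face)
        elif face.isdigit() and face != "3":
            # The hidden letter might be D, pairing D with a non-3 -> falsifies.
            flips.append(face)
        # "K" is irrelevant (the rule says nothing about non-D letters);
        # "3" cannot falsify it (the rule does not require 3 to pair with D).
    return flips

print(cards_to_flip(["D", "K", "3", "7"]))  # ['D', '7'] -- exactly two cards
```

Most people correctly pick "D" but miss "7"; that gap between how we do reason and how we ought to is exactly the descriptive/normative distinction above.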

  • Sections I-IV are taken from http://philosophy.hku.hk/think/ and are in use under the Creative Commons license. Some modifications have been made to the original content.

Critical Thinking Copyright © 2019 by Brian Kim is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.


Critical Thinking and Decision-Making: What is Critical Thinking?

Lesson 1: What Is Critical Thinking?

Critical thinking is a term that gets thrown around a lot. You've probably heard it used over the years, whether in school, at work, or in everyday conversation. But when you stop to think about it, what exactly is critical thinking, and how do you do it?


Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgments and decisions. It involves using logic, reasoning, and creativity to draw conclusions and generally understand things better.


This may sound like a pretty broad definition, and that's because critical thinking is a broad skill that can be applied to so many different situations. You can use it to prepare for a job interview, manage your time better, make decisions about purchasing things, and so much more.

The process


As humans, we are constantly thinking. It's something we can't turn off. But not all of it is critical thinking. No one thinks critically 100% of the time... that would be pretty exhausting! Instead, it's an intentional process, something that we consciously use when we're presented with difficult problems or important decisions.

Improving your critical thinking


In order to become a better critical thinker, it's important to ask questions when you're presented with a problem or decision, before jumping to any conclusions. You can start with simple ones like "What do I currently know?" and "How do I know this?" These can help to give you a better idea of what you're working with and, in some cases, simplify more complex issues.

Real-world applications


Let's take a look at how we can use critical thinking to evaluate online information. Say a friend of yours posts a news article on social media and you're drawn to its headline. If you were to use your everyday automatic thinking, you might accept it as fact and move on. But if you were thinking critically, you would first analyze the available information and ask some questions:

  • What's the source of this article?
  • Is the headline potentially misleading?
  • What are my friend's general beliefs?
  • Do their beliefs inform why they might have shared this?


After analyzing all of this information, you can draw a conclusion about whether or not you think the article is trustworthy.

Critical thinking has a wide range of real-world applications. It can help you to make better decisions, become more hireable, and generally better understand the world around you.


Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical thinking process requires an active and skillful approach: the evaluation, assessment, and synthesis of information obtained from observation, knowledge, reflection, or conversation, used as a guide to belief and action. This is why critical thinking is so often emphasized in education and academics.

Some may even view it as a backbone of modern thought.

However, it's a skill, and skills must be trained and exercised to reach their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help in planning your paper and making it more concise, but this isn't obvious at first. We pinpointed some of the questions you should ask yourself when applying critical thinking to your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Critical thinking applies not only to the outline of your paper; it also raises the question: how can we use critical thinking to solve problems in our writing's topic?

Let's say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You'll first have to define critical thinking for your viewers, then use critical thinking questions and related terms to familiarize them with your methods and start the thinking process behind them.


How to Evaluate Critical Thinking in the Age of AI


  • Many campuses have taken steps to curtail the use of generative AI tools, often over fears of plagiarism—but these fears overshadow AI’s potential as a pedagogical tool.
  • Because GenAI’s responses are immediate and nonjudgmental, learners can develop their critical thinking processes as they freely reflect on thoughts, responses, and concepts.
  • GenAI has not supplanted the role of instructors in the classroom. Rather, it has become a tool that we can use to teach, inspire, and guide our learners.

Learners have embraced generative artificial intelligence (GenAI), but academic administrators and faculty appear to be more apprehensive about using this emerging technology. Since GenAI began taking hold, administrators and faculty have set policies to restrict its use. They have used AI detectors to police plagiarism (despite the inconsistent capabilities of these tools), while their offices of integrity have doled out punishments.

But as educators have learned over the past year, these interventions won’t curtail the use of generative AI by learners. Moreover, we believe there are many reasons that educators should stop resisting this technology and start enjoying the benefits GenAI has to offer.

Before we offered anything close to a salve, we wanted to know: What are some of the sources of apprehension among our colleagues? The three of us have had productive conversations on this question with professors from various institutions. Through these conversations, we learned that most faculty were concerned about the same thing: plagiarism.

As we listened, we realized that plagiarism is merely an administrative term used by academic cultures. When we set rules prohibiting plagiarism, we create a policy safety net that allows us to teach and evaluate our students’ critical thinking. We want to know of our students: Is this your own thinking? Are these your own written words?

These questions lie at the heart of our anxiety. How can we evaluate a learner’s critical thinking if the content is AI-generated? While this is a fair question, we should be asking a different one: How can we use generative AI to develop our students’ critical thinking skills?

The Limitation of Traditional Teaching

Our answer here may surprise you. For example, prior to having chatbots in our own classroom, we provided learners with short scenarios focused on ethical dilemmas that entry-level professionals might encounter. Each learner would take 20 minutes to think through the dilemma, generate an overview, identify stakeholders, and decide what course of action to take. We would then spend the rest of the class time in discussion.

Our students enjoyed these thought challenges. As instructors, we recognized the effectiveness in getting future business leaders to think, write, and discuss potential moments of ethical fading. We never graded these interactions, and learners never asked for points for their participation. Socrates would have been proud. With these class discussions, we had transcended transactional coursework.


But these assignments had a significant limitation: It was difficult to measure whether all learners had pushed themselves to think critically and reflect deeply about the dilemma. As in any group discussion, some were more vocal than others. Even if called on, some learners would simply parrot previous responses. Moreover, these assignments were not designed to provide students with additional instructional feedback after the in-class discussions were over.

How could we address this limitation? How could we ascertain every learner’s depth of critical thinking through this exercise? Enter ChatGPT.

Conversing With the Bots

In an October 2023 article in AACSB Insights, Anthony Hié and Claire Thouary write that “the better students are at communicating with AI, the more likely it is that they will have seamless and rewarding learning experiences as they use AI to deepen their understanding of complex concepts, find solutions to problems, or explore new areas of knowledge.”

Yes, ChatGPT creates content; it can write essays, blogs, and even novels based on a simple prompt. But at the J. Whitney Bunting College of Business and Technology (CoBT) at Georgia College & State University in Milledgeville, we use it differently. Rather than worrying about how it might replace our teaching, we wanted to figure out how it could improve student learning.

After all, chatbots are, at their core, dialogical. With this in mind, we guide our learners to engineer effective prompts. We encourage them to learn how to engage in conversations with the bots. In our classes, learners discover that GenAI can serve as a tutor, an intellectual sparring partner, and a personal instructor.

Learning Through Repetition

Let’s look, for instance, at how we now ask students to think through ethical dilemmas in an in-class assignment in our undergraduate business communications course. Before the class session starts, we send students a specific prompt. We instruct them to copy and paste the entire prompt into their own ChatGPT accounts.

It’s important to note that the prompt’s rules and steps tell the bot how to behave. When we write in the prompt, “Now, please follow these steps,” we are instructing the bot to follow those exact steps. The learner is identified as the “user” in this context.

Once the learner submits the prompt, ChatGPT will create an ethical dilemma for the learner, along with the three discussion questions and the required list of components the learner must address. Until the learner has answered the questions and provided the information itemized on the list, the system will continue to request that the learner satisfy all components. (These components are listed a through d, as noted below.)

Once the learner gives the required responses, ChatGPT will then become the expert debater and present a response that questions the learner’s stance by offering the opposite perspective. The student will respond to that “debate,” and ChatGPT will then evaluate the learner’s final response.

Here is the prompt we use for this assignment:

Act as an expert professor in business ethics. Create an ethical dilemma that involves an entry-level finance employee.

Rule: The dilemma should be complex. Right versus wrong should not be explicit. Please do NOT provide analysis.

Now, please follow these steps:

1. Create three discussion questions.

2. After the user’s response, create three more questions, UNLESS the response does NOT include all the following components:

a) An overview of the ethical situation

b) A list of options

c) A list of stakeholders

d) A recommended action

3. If the responses are missing any of the components, please ask the user to provide the missing component.

4. If all the components are provided, then act as expert debater and present an opposite perspective.

5. Wait for a final statement from the user.

6. Once the user provides the final statement, evaluate the quality of the responses based on the detail of the user’s responses, user’s use of evidence, and ethical validity.

The prompt creates an individual dilemma, and learners must work through that dilemma step by step. This prompt focuses on finance, but we can modify it to focus on any industry. The benefit of these in-class conversations with ChatGPT is that learners often go beyond initial levels of thinking about the ethicality of the dilemma.
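The gatekeeping in steps 2 and 3 of the prompt can be sketched in code. This is a hypothetical illustration of the completeness check the bot is instructed to enforce; a real LLM judges the components semantically, while this sketch stands in with naive keyword matching.

```python
# Hypothetical sketch (not from the article) of the completeness check the
# prompt asks the bot to enforce: components a) through d) must all appear
# before the conversation advances. A real model judges this semantically;
# simple keyword matching is used here purely for illustration.

REQUIRED_COMPONENTS = {
    "a) overview":       ["overview", "situation"],
    "b) options":        ["option"],
    "c) stakeholders":   ["stakeholder"],
    "d) recommendation": ["recommend"],
}

def missing_components(response: str) -> list[str]:
    """Return labels of required components the response never mentions."""
    text = response.lower()
    return [label
            for label, keywords in REQUIRED_COMPONENTS.items()
            if not any(word in text for word in keywords)]

draft = "Overview of the situation: my options are to report it or stay quiet."
print(missing_components(draft))   # ['c) stakeholders', 'd) recommendation']
```

If the returned list is empty, the flow would move on to step 4 (the opposing "expert debater" perspective); otherwise the learner is asked for the missing pieces.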

In fact, learners reach secondary and tertiary levels of thinking. They ask themselves more nuanced questions: Why does a particular response matter? What are the implications of good or bad decisions? What learned concepts can be applied to making ethical decisions and acting ethically?

The point of critically writing through these dilemmas is not to bring about ethical epiphanies, since such epiphanies are hard to sustain. Instead, by regularly assigning these writing exercises, we want students to create muscle memory, as Brooke Deterline describes in her 2012 TED talk on creating ethical cultures in business. Through such repetition, learners are more likely to acquire ethical reflexes that guard against the potential risks of ethical fading.

Learning Without Judgment

Another important benefit to using generative AI tools for critical thinking in the classroom is that each tool acts as a nonjudgmental collaborator. This means learners can converse with the tool, asking any question they want without the collaborator judging that question as “stupid” or “unworthy.”

GenAI’s nonjudgmental, in-depth responses ultimately help learners develop their own critical thinking processes, because the platform allows them to play with and reflect upon a variety of thoughts, responses, and concepts. They feel free to ask questions, challenge their own perspectives, and allow the bot to help shape and organize their thinking. We cannot overstate the value to learners of playing with questions, thoughts, and concepts.

Generative AI isn’t going away. As Microsoft and Google integrate generative AI into their word processing software, instructors need to adapt.

In a September 2023 article published by Harvard Business Publishing–Education, Ethan Mollick and Lilach Mollick note that the instantaneous feedback from ChatGPT adds significant educational benefits. Learners often have attention and distraction issues, but AI tools can instantly generate feedback, which means learners don’t have to wait to see if their responses can be better developed.

Revolutionized, Not Replaced

As we have found, GenAI has not supplanted the role of instructors. On the contrary, after our students’ initial independent conversations with the bots, our facilitated class discussions are much more focused and informed. We can select one dilemma from the group and discuss it in detail, and those discussions are lively and provocative. Now, everyone has self-developed perspectives. We still find room to teach, inspire, and guide our learners.

To further ensure accountability, we require students to submit their conversations to our learning management system, a process that requires just the click of a button. We then can review and evaluate each learner’s response.

At the end of the day, generative AI isn’t going away. As Microsoft and Google integrate generative AI into their word processing software, instructors need to adapt. This is why the CoBT continues to expand its work with this technology. For instance, we have established an AI Lab, and we offer GenAI workshops for the campus and broader communities in Georgia. We continue to bring in industry leaders to engage our campus community on the topic, and we collaborate on AI projects with students and faculty outside the CoBT.

We must continue to innovate to make the best use of all AI has to offer. Let’s use this revolutionary tool to help ourselves and our learners become better thinkers—and better people.

Why is critical thinking important?

What do lawyers, accountants, teachers, and doctors all have in common?


What is critical thinking?

The Oxford English Dictionary defines critical thinking as “The objective, systematic, and rational analysis and evaluation of factual evidence in order to form a judgment on a subject, issue, etc.” Critical thinking involves the use of logic and reasoning to evaluate available facts and/or evidence to come to a conclusion about a certain subject or topic. We use critical thinking every day, from decision-making to problem-solving, in addition to thinking critically in an academic context!

Why is critical thinking important for academic success?

You may be asking “why is critical thinking important for students?” Critical thinking appears in a diverse set of disciplines and impacts students’ learning every day, regardless of major.

Critical thinking skills are often associated with the value of studying the humanities. In majors such as English, students will be presented with a certain text—whether it’s a novel, short story, essay, or even film—and will have to use textual evidence to make an argument and then defend their argument about what they’ve read. However, the importance of critical thinking does not only apply to the humanities. In the social sciences, an economics major, for example, will use what they’ve learned to work out solutions to issues ranging from land and other natural resource use, to how much people should work, to how to develop human capital through education. Problem-solving and critical thinking go hand in hand. Biology is a popular major within LAS, and graduates of the biology program often pursue careers in the medical sciences. Doctors use critical thinking every day, tapping into the knowledge they acquired from studying the biological sciences to diagnose and treat different diseases and ailments.

Students in the College of LAS take many courses that require critical thinking before they graduate. You may be asked in an Economics class to use statistical data analysis to evaluate the impact on home improvement spending when the Fed increases interest rates (read more about real-world experience with Datathon). If you’ve ever been asked “How often do you think about the Roman Empire?”, you may find yourself thinking about the Roman Empire more than you thought—maybe in an English course, where you’ll use text from Shakespeare’s Antony and Cleopatra to make an argument about Roman imperial desire. No matter what the context is, critical thinking will be involved in your academic life and can take form in many different ways.

The benefits of critical thinking in everyday life

Building better communication

One of the most important life skills that students learn as early as elementary school is how to give a presentation. Many classes require students to give presentations, because being well-spoken is a key skill in effective communication. This is where the benefits of critical thinking come into play: using the skills you’ve learned, you’ll be able to gather the information needed for your presentation, narrow down what information is most relevant, and communicate it in an engaging way.

Typically, the first step in creating a presentation is choosing a topic. For example, your professor might assign a presentation on the Gilded Age and provide a list of figures from the 1870s–1890s to choose from. You’ll use your critical thinking skills to narrow down your choices. You may ask yourself:

  • What figure am I most familiar with?
  • Who am I most interested in? 
  • Will I have to do additional research? 

After choosing your topic, your professor will usually ask a guiding question to help you form a thesis: an argument that is backed up with evidence. Critical thinking benefits this process by allowing you to focus on the information that is most relevant in support of your argument. By focusing on the strongest evidence, you will communicate your thesis clearly.

Finally, once you’ve finished gathering information, you will begin putting your presentation together. Creating a presentation requires a balance of text and visuals. Graphs and tables are popular visuals in STEM-based projects, but digital images and graphics are effective as well. Critical thinking benefits this process because the right images and visuals create a more dynamic experience for the audience, giving them the opportunity to engage with the material.

Presentation skills go beyond the classroom. Students at the University of Illinois will often participate in summer internships to get professional experience before graduation. Many summer interns are required to present about their experience and what they learned at the end of the internship. Jobs frequently also require employees to create presentations of some kind—whether it’s an advertising pitch to win an account from a potential client, or quarterly reporting, giving a presentation is a life skill that directly relates to critical thinking. 

Fostering independence and confidence

An important life skill many people start learning as college students and then finessing once they enter the “adult world” is how to budget. There will be many different expenses to keep track of, including rent, bills, car payments, and groceries, just to name a few! After developing your critical thinking skills, you’ll put them to use to consider your salary and budget your expenses accordingly. Here’s an example:

  • You earn a salary of $75,000 a year. Assume all amounts are before taxes.
  • Your largest monthly expense is $1,800: 1,800 x 12 = 21,600 a year, and 75,000 – 21,600 = 53,400. This leaves you with $53,400.
  • A second monthly expense is $320: 320 x 12 = 3,840 a year, and 53,400 – 3,840 = 49,560.
  • A third monthly expense is $726: 726 x 12 = 8,712 a year, and 49,560 – 8,712 = 40,848.
  • You’re left with $40,848 for miscellaneous expenses. You use your critical thinking skills to decide what to do with that money. Thinking ahead to retirement, you decide to put $500 a month into a Roth IRA, leaving $34,848. Since you love coffee, you try to figure out if you can afford a daily coffee run. On average, a cup of coffee costs $7, so 7 x 365 = $2,555 a year for coffee, and 34,848 – 2,555 = 32,293.
  • You have $32,293 left. You will use your critical thinking skills to figure out how much to put into savings, how much to set aside to treat yourself from time to time, and how much to reserve for emergency funds. With the benefits of critical thinking, you will be well-equipped to budget your lifestyle once you enter the working world.
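The arithmetic above can be collected into a short script. This is a sketch for illustration only; the dollar amounts are the example's, while the variable names and structure are ours.

```python
# Sketch of the budget walk-through above (amounts from the example;
# variable names are illustrative). All amounts are before taxes.

salary = 75_000
monthly_expenses = [1_800, 320, 726]      # e.g., rent, car payment, bills

remaining = salary - sum(m * 12 for m in monthly_expenses)
print(remaining)                          # 40848 for miscellaneous spending

remaining -= 500 * 12                     # monthly Roth IRA contribution
print(remaining)                          # 34848

remaining -= 7 * 365                      # a $7 coffee every day
print(remaining)                          # 32293 left to allocate
```

Laying the numbers out this way makes it easy to test "what if" scenarios, such as changing the coffee habit or the IRA contribution, which is exactly the kind of deliberate analysis the example is meant to practice.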

Enhancing decision-making skills

Choosing the right university for you

One of the biggest decisions you’ll make in your life is what college or university to go to. There are many factors to consider when making this decision, and the importance of critical thinking will come into play when weighing these factors.

Many high school seniors apply to colleges with the hope of being accepted into a certain program, whether it’s biology, psychology, political science, English, or something else entirely. Some students apply with certain schools in mind due to overall rankings. Students also consider the campus a school is set in. While some universities such as the University of Illinois are nestled within college towns, New York University is right in Manhattan, in a big city setting. Some students dream of going to large universities, and other students prefer smaller schools. The diversity of a university’s student body is also a key consideration. For many 17- and 18-year-olds, college is a time to meet peers from diverse racial and socio-economic backgrounds and learn about life experiences different from one’s own.

With all these factors in mind, you’ll use critical thinking to decide which are most important to you—and which school is the right fit for you.

Develop your critical thinking skills at the University of Illinois

At the University of Illinois, not only will you learn how to think critically, but you will put critical thinking into practice. In the College of LAS, you can choose from 70+ majors where you will learn the importance and benefits of critical thinking skills. The College of Liberal Arts & Sciences at U of I offers a wide range of undergraduate and graduate programs in life, physical, and mathematical sciences; humanities; and social and behavioral sciences. No matter which program you choose, you will develop critical thinking skills as you go through your courses in the major of your choice. And in those courses, the first question your professors may ask you is, “What is the goal of critical thinking?” You will be able to respond with confidence that the goal of critical thinking is to help shape people into more informed, more thoughtful members of society.

With such a vast representation of disciplines, an education in the College of LAS will prepare you for a career where you will apply critical thinking skills to real life, both in and outside of the classroom, from your undergraduate experience to your professional career. If you’re interested in becoming a part of a diverse set of students and developing skills for lifelong success, apply to LAS today!

Read more first-hand stories from our amazing students at the LAS Insider blog .

UGC NET 2024 Paper 1: List of topics you must prepare for

Apr 26, 2024

Teaching Aptitude

Teaching methods encompass various strategies such as lectures, discussions, and demonstrations, each suited to different learning objectives and audiences. A good teacher possesses qualities like patience, effective communication skills, and adaptability to engage and inspire learners.

Classroom Management

Effective classroom management techniques are essential for maintaining a productive learning environment. This includes establishing clear expectations, managing behavior, and fostering positive relationships among students.

Evaluation Methods

Assessment methods such as assignments and tests are crucial for measuring students' understanding and progress. Choosing appropriate evaluation techniques aligned with learning objectives ensures fair and accurate assessment.

Learner's Characteristics

Understanding Piaget's stages of cognitive development helps educators tailor instruction to meet students' cognitive abilities. Recognizing diverse learning styles, including auditory, visual, and kinesthetic, allows for differentiated instruction to cater to individual needs.

Individual Differences

Every learner is unique, with varying abilities, backgrounds, and motivations. Acknowledging and addressing these differences is vital for creating inclusive learning environments and fostering student success.

Factors Affecting Teaching

Effective curriculum design is fundamental to facilitating meaningful learning experiences. Integrating teaching aids and technology enhances engagement and understanding, while the learning environment and external influences like parental involvement play significant roles in shaping learning outcomes.

Infrastructure & Learning Environment

The physical and social environment in which learning occurs significantly impacts student engagement and achievement. Providing adequate infrastructure and cultivating a supportive learning atmosphere are essential for maximizing learning potential.

Teaching Methods

Utilising e-learning platforms like SWAYAM and MOOCs expands access to educational resources and promotes self-directed learning. Group discussions and collaborative learning activities foster critical thinking, communication skills, and peer interaction.

Problem-Solving and Critical Thinking

Encouraging problem-solving methods and critical thinking skills development empowers learners to analyze situations, explore alternatives, and make informed decisions. Balancing learner-centered and teacher-centered approaches fosters active engagement and deep understanding.

Research Aptitude

Differentiating between quantitative and qualitative research methodologies informs effective research design and data collection techniques. Upholding research ethics and avoiding plagiarism ensures the integrity and credibility of scholarly inquiry.
