
What Does Critical Thinking Mean in the 21st Century?

This article is a continuation of What is Critical Thinking?

Critical Thinking as the Solution to 21st Century Problems?

The Democratisation of Knowledge and Misinformation

The 21st century, often referred to as the “Digital Information Age,” has achieved something truly unprecedented—today anyone with a basic Internet connection is granted free and instant access to a bottomless repository of information. The utopia where knowledge becomes readily accessible to everyone, regardless of class and race, is finally here. We are living in it right now.

And yet, as with everything, the democratisation of knowledge has not been without unintended consequences. In addition to striking down traditional barriers to knowledge, the Internet also struck down traditional standards of quality for information, effectively making the golden age of knowledge also the golden age of widespread misinformation, fake news, and charlatanism.


In the absence of proper verification procedures and quality control, it is ultimately up to individuals themselves to decide what is trustworthy and what is not. This is why critical thinking is said to be more important now than ever before: without it, we would lose our capacity to tell true from false.

My question is the following: what does critical thinking in the 21st century mean? Is critical thinking merely the solution to the problems we’re facing in the 21st century? Or does the increasing prominence of this term in current social discourse suggest that it has itself become a problem today?


The Battle Between Authentic and Inauthentic Thinkers

Today, it seems, the stakes have been raised. The battle is no longer the black-and-white one between thought and prejudice, fact and ideology, reason and dogma, but rather one between two different strands of thought, both claiming to be “critical.” For lack of better terms, I propose to call one “authentic” and the other “inauthentic,” for reasons I will discuss later.

By inauthentic critical thinkers, I am referring to, among others, science deniers, contemporary fundamentalists, and especially conspiracy theorists. 

What strikes me about them is not so much their dogmatism and delusions as how they’ve managed in recent years to appropriate the discourse on critical thinking in service of their own agendas.


For example, there is a common misconception that belief in conspiracy theories begins with a dogmatic leap of faith when, in reality, it usually begins with something very close to critical scepticism. It is simply not true that conspiracy theorists go around forcing their wild conjectures down people’s throats. More often than not, their opening move is much more modest and reasonable, at least on the surface: all they seem to do is advise people not to “blindly trust authorities” and instead “do their own research.”


Didn’t we hear so many people proudly proclaim during the COVID-19 pandemic that they “did their own research”? And that what they had found led them to be sceptical of what many of the experts, authorities, and public officials were saying? Sure enough, the deeper they dug, the more “research” they did, the more questions and suspicions they began to accumulate regarding every aspect of the pandemic: the existence of COVID-19, the safety of vaccines, the credibility of so-called “experts” and the media, and so on.

Wasn’t COVID-19 really a plot by Big Pharma to sell the public overpriced vaccines? If not, why is the pharmaceutical industry making billions of dollars in profit from the pandemic? Why are they suppressing and stigmatising alternative medications? Also, isn’t it convenient that the pandemic gave governments all around the world a pretext to expand mass surveillance and buttress state control? What could have been their real motives and intentions? And so on, and so forth. The questions multiply infinitely. 

Of course, I am not saying that those in power were fully transparent during the pandemic or that we should naïvely assume there were no ulterior motives behind their apparently benign policies.

The point is that it’s not so easy to articulate in this case what type of scepticism constitutes genuine critical thinking and what type of scepticism constitutes conspiratorial thinking, if there is a distinction to be made at all. Perhaps the uncomfortable fact we’ll have to come to accept is that they share exactly the same roots. 

The problem here is not so much about telling which theories are true and which are false. What I would like to draw attention to instead is the self-identification of many conspiracy theorists, science deniers, and even fundamentalists with the task of critical thinking. This is why I argue that the black-and-white opposition between critical and non-critical thinking is no longer tenable today.

Even if it’s true that the people who fall for conspiracy theories are not as critical as they believe (which is almost certainly the case), the fact that they openly wave the flag of critical thinking in public should worry us nonetheless. For it is precisely in the name of “critical thinking,” carried out independently of the influence of mainstream narratives, that these people discredit well-established facts: the existence of COVID-19, the imminent threat of climate change, the actual shape of the Earth, and the empirical evidence for evolution by natural selection, to name a few.

Conspiracy Theorists and Their Search for Truth

In a strange and perverse way, couldn’t we say that the motto conspiracy theorists live by is exactly that of the Enlightenment—“Sapere aude!” (“Have the courage to use your own understanding!”) from Kant’s seminal 1784 text, “Was ist Aufklärung?” (“What is Enlightenment?”)?

In the eyes of conspiracy theorists, the masses are too lazy to search for the truth, too cowardly to think freely and independently of the mainstream media. Like Kant, they would say that many gladly remain minors for life, unaware of how they are being manipulated left and right by big corporations and scientific experts.

Now, it’s obvious that we have a problem here.

The reality is that, under some circumstances, conspiracy theorists do appear to be more critical than the average citizen.

We may comfortably mock those who believe that the Earth is flat, but if we’re being honest with ourselves, there is a good chance that many of us would lose in a real debate with them. We could, of course, blame our defeat on mere sophistry that should be dismissed out of hand rather than engaged with sincerely.


But perhaps the conspiracy theorist has a point: how many of us have actually analysed the scientific evidence behind the shape of the Earth? How many of us are capable of justifying on the spot why we believe the Earth to be round and not flat? Don’t we believe the Earth to be round simply because our teachers and textbooks said so? And if that’s the case, then aren’t we the naïve and gullible ones here? Aren’t we the ones who are uncritical? 

This series of questions takes us to the heart of the problem: namely, the relationship between the critical self and authority. Is it a necessary condition of critical thinking that we distrust and disregard all forms of epistemic authority? Isn’t it the case that every appeal to authority is ipso facto a fallacy? How can one claim to think independently and be dependent on authorities at the same time? Doesn’t this amount to some kind of contradiction? 

To begin responding to these questions, we must first examine more closely the different justifications given by “authentic” and “inauthentic” critical thinkers (keeping in mind these labels are only provisional at this stage) for distrusting authority. So, what’s wrong with authority?

The Basis for Distrusting Authority

The standard (authentic) argument can be summed up as follows: if it is necessary for us as critical thinkers to distrust what authorities say, then this is ultimately because we recognise that authorities are human beings just like the rest of us and therefore just as capable of error and prejudice. No one, regardless of their expertise or status, is all-knowing or all-objective. There will always be personal blindspots, oversights, and distortions which will have to be corrected by others. Fallibility is thus the main justification for distrusting authority. 

But this is not why conspiracy theorists, science deniers, and fundamentalists distrust authority. Their justification rests instead on the fundamentally dishonest nature of authority.

Dishonesty attributes a malicious intent to those in power in a way that fallibility doesn’t necessarily: this is most clearly seen in conspiracy theories where the government, the media, the education system, etc., are all “in it together” to sabotage public interest and to cover up “the Truth.” 

The Root of Inauthentic Thinking: Conspiracy Theorists View Authority as Infallible

In the world of the conspiracy theorist, authority is never fallible; on the contrary, those in power are all too competent. Just think about how much continual and impeccable effort it must require to sustain the perfect illusion that the Earth is round when it is actually flat, or how much punctilious thought and consideration it would take to additionally plant red herrings that distract the public from “the Truth.” In their world, authorities never make mistakes—and that is precisely what makes them so dangerous and untrustworthy.

As a result, nothing can be trusted or taken for granted; everything, no matter how trivial or apparently insignificant, always has a deeper meaning that can be traced back and integrated into one overarching hyper-consistent narrative—it’s all part of “one big plot.” 

It is hopefully evident by now what makes this type of distrust in authority inauthentic. Despite all appearances, conspiracy theorists are not actually critical in the sense that they recognise the inherent limits of authority. The axiomatic presumption they hold, namely that the authorities are constantly trying to deceive us, is not only unreasonable but also far too naïve and simplistic.


It divides the world neatly into good guys and bad guys, truth-tellers and deceivers. It is here that we find, paradoxically, that the most radical scepticism ends up coinciding with the most dogmatic beliefs. By flatly disregarding everything authorities have to say on the presumption that they’re dishonest and corrupt, one immediately places oneself in the comfortable position of knowing how everything really stands.

Classifying all claims by authority as deceptive a priori leaves nothing in this world trustworthy except the conspiracy theory that recognises and points out this deception, which thereby proves itself to stand above the world it criticises. But there is nothing actually critical about this criticism, for it fails to confront the complications inherent within truth itself.

For the conspiracy theorist, the only obstacle preventing us from accessing the truth is the authorities’ deception; by removing the veil of deception, we automatically gain access to the unambiguous truth that was there all along. In this respect, there is really no critical analysis of what we take to be true at all; there is no sense in which the truth we understand can be wrong since whatever is wrong must be due to the authorities’ deception.

In reality, this whole narrative of deception functions as a coping mechanism so that the conspiracy theorist never has to confront the ambiguities of actual thinking.

For all their apparent scepticism and critical spirit, conspiracy theorists can hardly tolerate ambiguity; like everyone else, they crave certainty. When experts disagree or contradict each other in public—which they always do—the conspiracy theorist is more likely to take this as evidence of infighting or a cover-up rather than recognising how this is part and parcel of the process of establishing consensus.

They would much rather have the expert be an all-powerful, all-evil deceiver than somebody who makes mistakes. It is much more comfortable knowing that you’re being deceived, and what you’re being deceived about, than not knowing whether you and others could be wrong. Denying the fallibility inherent to authority also ensures that one never has to confront one’s own fallibility.

Here we can notice how the two justifications for distrusting authority end up implying two different attitudes of the critical self. There is something deeply false or inauthentic about the absolute independence of thought conspiracy theorists like to espouse.

If anything, their idea of independence is more arrogant than it is critical. To be fair, it is true in principle that anyone living in our information age has little excuse not to do their own research and to come to their own conclusions, but we all know that things are hardly so straightforward in practice.

The reality is that most people lack proper research skills, and so, what “doing your own research” effectively amounts to for these people is performing a few Google searches already primed by confirmation bias before falling into a rabbit hole of ready-made, readily-consumable conspiracy theories.

What these people don’t seem to realise is that everybody has their own limitations, and that being critical entails not only being critical of others but also being critical of oneself. The former without the latter is just arrogance. Authentic distrust of authority, insofar as it’s consistent, must always imply distrust of ourselves as authority too.


How to Practice Authentic Critical Thinking

But how might we put ourselves in a legitimate position to criticise ourselves? Isn’t this a case of trying to pull ourselves up by our own bootstraps? Here is where we should reassess our relationship with authority. If authentic critical thinking must necessarily be critical of itself, then, paradoxically perhaps, it must also depend on some form of authority outside itself.

The Importance of Relative Authority

Of course, the kind of authority I’m suggesting here is not the stereotypical absolute authority, i.e., an authority that authorises the truth by virtue of being an authority. What I have in mind is something much more modest, namely the concept of relative authorities, i.e., people who probably know better than us about a particular subject matter.

Taken in this context, authority is not so much something one claims to be but rather something one has relative to somebody else. There is nothing inherently special about this authority one can possess; it is not a privilege bestowed upon select individuals. 

Anyone can, in principle, come to understand for themselves the reasons and justifications behind what relative authorities claim and, in so doing, dissolve the apparent coercive force of their assertions.

Authority in this sense merely signifies what lies outside of our current knowledge and understanding, which we must admit is imperfect. Authorities can be wrong, naturally, but so can we. The dependence in this case is thus anything but one-sided.

Instead of a hierarchy where knowledge is concentrated at the top, in the hands of a few experts, the concept of relative authority compels us to imagine a network of critical thinkers where nobody has all the knowledge and power and where everybody is equally fallible and dependent on others to correct them.

What We Need In the 21st Century: A Public Sphere


There is a name for this kind of network: the public sphere. While there is something to be said about critical thinking as a solitary withdrawal from the business of everyday life, the concept of the public sphere is not the simple reintroduction of the thinking self back into the public world but precisely the constitution of a world of the withdrawn.

A healthy public sphere is one in which all rational citizens are allowed the full freedom to communicate their thoughts with each other, to experiment with various perspectives disinterestedly, as well as point out the shortcomings and blindspots in each other’s thinking. And perhaps this is just what we need in the 21st century: not just more critical thinking but authentic critical thinking practised collectively within a well-functioning public sphere.

The Malaysian Philosophy Society actively promotes the cultivation of a network of critical thinkers in the community. Join the community here to spread the spirit of philosophy and critical thinking.

*Disclaimer: This article has been edited for clarity. The views and opinions expressed in this article are those of the author and do not necessarily reflect the stance of the Malaysian Philosophy Society.


Thinking Critically About Critical Thinking: What Really Is Critical Thinking?

Is Critical Thinking A Skill?

We appear to be living in a time when critical thinking enjoys extraordinary traction. Yet perhaps it is worth pausing for a moment to reflect and wonder if, amid all the enthusiasm and lip service, people actually think more critically today, or if they ever think about what it means to think critically. But how does one go about thinking critically about critical thinking?


What we constantly hear from the media, educators, politicians, even corporate leaders is that critical thinking is an essential skill in the 21st century. Such a proposition opens virtually every discussion on critical thinking and is accordingly treated as self-evident—it has the form of “it is undeniable that . . . .” However, I have deliberately chosen not to begin with this proposition, as my intention is to problematise its assumptions and implications. What strikes me is the predicate of this ubiquitous proposition: “skill.” Why is critical thinking a skill? Is thinking a skill?


What is Critical Thinking?

What is critical thinking? What kind of thing is it? This question is hardly ever formulated and even more rarely contemplated. The reason for this seems clear: everyone already knows what thinking is—it’s a skill, just like leadership, collaboration, communication, and so on, are skills.

But why is it a skill? Indeed, why is everything spontaneously considered a skill today? Why is it natural to assume that the role of education is to equip students with the proper skills needed to successfully navigate the spheres of economic and social life?


It has often been repeated that children must be taught how to think, not what to think—a principle which anyone with good sense agrees with. And yet, I claim, the choice of rhetorical device in this slogan ends up introducing a certain ambiguity to its implications.

For, by juxtaposing “how to think” with “what to think,” and by further emphasising that it is the “how” that pre-eminently concerns the task of education, the slogan creates the opportunity for a conceptual sleight of hand: the belief that we should be teaching “how to think” quickly becomes the belief that “thinking is a how.”

In other words, we run the risk of reducing thinking to a mere instrumental activity, to a form without content, to a means indifferent to ends. 

Instrumentalisation of Reason

Of course, this is not a new problem. Already in the 1940s, Frankfurt School critical theorists like Adorno and Horkheimer had identified and critiqued a prevalent phenomenon at the time, which they referred to as “the instrumentalisation of reason.” 

In his 1947 book Eclipse of Reason, Horkheimer details how modernity has progressively reduced reason to nothing more than “the faculty of classification, inference, and deduction, no matter what the specific content” or “the ability to calculate probabilities and thereby to co-ordinate the right means with a given end.”

Horkheimer calls this post-Enlightenment conception of reason “subjective (instrumental) reason,” which he opposes to the “objective reason” exemplified in the Greek concept of logos. According to subjective reason, there are no such things as rational ends, only rational means to achieve subjective ends. The discussion over the rational value of ends is meaningless because the selection of ends depends ultimately on the individual and his preferences. 

The Origin of Reason (Logos)

Natural as this view appears to us today, it is important to note that it runs contrary to many of the dominant currents of philosophical thought from Plato and Aristotle up to the pioneers of the Enlightenment.

In the ancient world, reason (logos) was thought to inhere not only in the minds of rational subjects but also in the structure of objective reality itself. Reason used to have an objective and normative dimension which it has lost today in becoming something subjective and instrumental.


It used to be that a good life meant a life lived in accordance with the standards of reason—a conviction which obviously rests on the premise that reason has substantive standards of its own in the first place. 

Consequences of Instrumental Thinking

For Horkheimer, one of the chief dangers that the instrumentalisation of reason posed was that modern society would eventually come to forget the intellectual roots of many of the ideals and values it takes for granted, such as justice, equality, liberty, democracy, etc., thereby creating the opportunity for their destruction.

This is because subjective reason has no real grounds to claim whether it is rational or irrational to prefer injustice over justice or oppression over freedom since these claims are scientifically unverifiable and reason has no intrinsic preference of its own. 


Perhaps the most extreme consequence of this pure instrumental thinking can be found documented in Hannah Arendt’s infamous report on Adolf Eichmann’s trial in Jerusalem. There Arendt put forward what is widely considered her most scandalous thesis, namely that what Eichmann—one of the key figures in the Nazis’ Final Solution—was truly guilty of was not so much personal genocidal intent as his utter failure to think about the consequences of his own actions.

Clearly, what had struck her while witnessing Eichmann’s trial was his pathetic defence that he was just “obeying orders” as a lawful citizen and that he had only a circumscribed role to play in a large-scale process that exceeded him. To be sure, Eichmann did not carry out the Führer’s orders without thinking; we can be confident that plenty of meticulous thought went into planning how the Nazis were going to identify and transport millions of Jews to the extermination camps. And yet, such calculations of logistics, while not trivial, can hardly be called thinking in the proper sense if thinking is to be regarded as a worthy activity at all.

For Arendt, thinking means reflecting on one’s own actions, judging right from wrong, and thinking in the place of others. It is in this sense that Eichmann’s actions were completely thoughtless. And it was precisely this kind of banal thoughtlessness practised on a nationwide scale, according to Arendt, that resulted in one of the greatest atrocities committed against humanity. 

From Rational Being to Economical Being?

In many ways, Horkheimer’s critique of instrumental reason remains relevant for analysing our predicament today. There is a case to be made that reason has since become even more instrumentalised, particularly under the sway of neoliberalism. We humans have always distinguished ourselves from animals by our capacity to think and reason.

We take pride in calling ourselves Homo sapiens, meaning “wise human.” And yet, today it seems that being rational has very little to do with being wise. Rationality has since acquired a new meaning: being rational today means being economical—indeed, we have all evolved into Homo economicus.


Accordingly, rationality is no longer considered the object of study of the philosopher but of the economist. Recall the ambitious neoclassical definition of economics supplied by Lionel Robbins:

The science which studies human behaviour as a relationship between ends and scarce means which have alternative uses.

Lionel Robbins, An Essay on the Nature and Significance of Economic Science

What’s incredible about this definition is that it attempts to define economics without any reference to the typical mechanisms of production, exchange, and consumption. Economics is now the study of rational human behaviour—rational in the sense of maximising utility given constraints and human in the sense of being Homo economicus.
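To see concretely what “maximising utility given constraints” amounts to, here is the textbook formalisation of the consumer’s problem, offered as a standard illustration of the neoclassical picture rather than Robbins’s own notation:

\[
\max_{x_1,\dots,x_n} \; U(x_1,\dots,x_n) \quad \text{subject to} \quad \sum_{i=1}^{n} p_i x_i \le m
\]

Here \(U\) is the agent’s utility function over bundles of goods, \(p_i\) the prices of the scarce means, and \(m\) the available budget. Notice that reason operates only on the allocation of means; the utility function itself, that is, the agent’s ends, is simply taken as given.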

By casting its net as widely as possible, neoclassical economics effectively legitimises the extension and generalisation of its mode of analysis to encompass any rational conduct, including non-economic activities such as healthcare and education.

As a result, not only can we apply an economic lens to analyse sectors such as education, but we also gradually come to perceive those sectors as being essentially economic activities.

We start to see education as a form of “investment” whose goal is the formation of “human capital.” We start to see schools as vehicles of social mobility, subjects as career paths, and academic skills as an individual’s means of production. 

There is a prevailing sense today that although more people are receiving a proper education, fewer people turn out to be highly educated, or at least, the two have not grown in proportion. And this has less to do with the massification of educational institutions and more to do with how we conceptualise the purpose of education in our culture.


On this point, let us confine ourselves to one example: we’re all aware that in many countries (including Malaysia), the perceived value of the humanities has been in steady decline. Putting aside the practical motivations for steering students towards STEM subjects and away from the humanities, there is a more fundamental reason as to why the humanities are not taken as seriously as the sciences: namely, people no longer find it meaningful to attribute a “what” to thinking.

If thinking is only sufficient for determining means, and if ends are motivated only by personal factors, then the function of the humanities in education—the elevation of ends which complements the equipment of means—becomes obsolete. It is a waste of time to try to deepen one’s appreciation of literature, or to speculate about the good life, or to heighten one’s historical awareness if tastes, values, and meaning are ultimately subjective and arbitrary. As anybody’s opinion is just as valid as anybody else’s, there is no good reason to change one’s perspective on anything, much less to better it.

Read more on the value of Philosophy and Critical Thinking to Malaysians here.

What Really Is Critical Thinking?

Critical Thinking vs Clever Argumentation

Taking all of the above into consideration, I suppose this gives us sufficient grounds to be sceptical of the uncritical insistence that thinking is a skill, or, at the very least, to question the supposed neutrality of such an assertion.

We’ve already run through very casually the consequences that the instrumentalisation of thinking has for society at large; what is left for us to consider now are its consequences for thinking itself. What becomes of critical thinking in the context of this instrumental-subjectivist attitude?

The answer is that it gets assimilated to clever argumentation, and with this comes the immanent danger of sophistry. To be fair, sophistry, in reality, shares a great many similarities with critical thinking—more than people like to admit—which makes distinguishing the two a rather challenging task. But the stakes are high, I contend, since it seems to me that many of today’s critical thinking coaches are more aptly characterised as modern-day sophists.

The term “sophistry” comes from a group of itinerant intellectuals in the fifth century BC known as “the Sophists.” But the term wasn’t always used as a pejorative. In fact, most of the negative connotations we associate with sophistry today, such as the deliberate use of fallacious reasoning, deception, and trickery to make the weaker argument seem stronger, stem directly from the criticisms made by Plato and Aristotle, who regarded the Sophists as the sworn enemies of all authentic philosophy. 

Back in Ancient Athens, the Sophists were mainly known as professional instructors who trained citizens in the art of public speaking and debate in exchange for a fee. Recall that Ancient Athens was a direct democracy (unlike ours which is representative) and that all free citizens were expected to take an active part in political and judicial affairs.

This created a strong demand for sophistic instruction that could help regular Athenians argue persuasively in front of a critical audience. Persuasion was key—its compelling force in a large gathering such as the ekklēsia far outstripped that of cold, sober reasoning. Argumentation has always had a theatrical element to it, and thus rhetoric became the most valuable trade the Sophists could teach.


Despite common usage, sophistry—or perhaps more accurately, “sophistics”—is not so much about twisting what is true and convincing others of what is false by means of calculated manipulation. If we’re being charitable here, sophistry is a pure means. The Sophists taught methods of argument and rhetoric which could then be adapted for whatever ends an individual saw fit to pursue. Sophistry is all about the “how,” not about the “what.” What is right and what is wrong is not the Sophists’ business. Their craft consists solely of methods of evaluating arguments and refuting them effectively, as well as ways of convincingly demonstrating the superiority of one’s own position over that of one’s opponents through the use of various logical and rhetorical tactics. 

Evidently, the problem with sophistry is that, compared to critical thought, it is unreflective.

Reasoning becomes mere rationalisation in the hands of the Sophist, and truth a synonym for victory. In this sense, we could say that “sophistry” is somewhat of a misnomer since it is more about cleverness than actual wisdom (sophia). And while there is a certain charm to cleverness, it quickly wears thin on people the moment they realise it’s all vanity without any real insight. 

Critical Thinking Is Not Merely A Skill

This is why I contend that thinking is not a skill and should not be construed as one. But let me be precise here: I am not saying that there are no skills involved in thinking. This would again be confounding “how to think” with “thinking is a how.” I am well aware that thinking involves many skills—analysis, interpretation, questioning, reasoning, etc.—and yet, my claim is that thinking is irreducible to them in the final analysis.

A valuable analogy can be drawn here between thinking and reading. As with thinking, there are many skills involved in reading—word awareness, fluency, comprehension, etc.—but reading itself is not a skill, contrary to what many educators believe. It is not a skill because true reading has less to do with whether or not I understand the text or whether or not I can extract and manipulate the information contained within—such is reading under examination conditions, which could not be more artificial.

On the contrary, reading is about opening ourselves up to the thoughts and experiences of others, not about withdrawing into the comprehension of the self. It is not about comprehending everything the author says but precisely attending to those things we don’t yet comprehend. To reduce reading to comprehension then—which is exactly what school does (the reading paper is literally called the comprehension paper)—is thus to foreclose this dimension of otherness entirely, which in practice has the effect of sucking all the joy and serendipity from reading as an activity.

The same can be said for thinking. Being good at thinking has nothing to do with winning debates over others or with claiming intellectual superiority over others—what Plato called “eristics.” It is not judged by one’s ability to argue for any viewpoint persuasively, much less by one’s ability to refute any argument hurled one’s way—what Plato called “antilogic.” Thinking is not about being clever. Thinking is not argumentation. What is thinking, then? Perhaps Plato can offer us a clue.

Critical Thinking Involves Dialectic

For Plato, what distinguishes philosophy from sophistry is that the philosopher practices the dialectic. It is unfortunate that the concept of dialectic so often gets assimilated to debate, considering that it constitutes the way out of ceaseless combative argumentation. This isn’t to suggest that debate is somehow an unworthy activity for thought, however. I’m aware that many educators are now trying to incorporate debates into their lessons as a means of stimulating critical thinking.

And I am in full agreement with them so long as we stress to students the mere preparatory character of debate; otherwise, we’re just encouraging sophistry. Though it may seem contrary to what I’ve said earlier, I can acknowledge a certain didactic value in teaching students how to deconstruct arguments from both sides and to think in the shoes of people they personally disagree with. For someone who has not yet begun to think independently, it is essential that we jolt them out of their complacent beliefs by forcing them to confront contrary and conflicting viewpoints.


The goal is to induce a state of aporia—that is, a state of puzzlement brought about by the equal strength of contradictory viewpoints. Every debater knows that, with enough skill and wit, a convincing case can be made for either side. But in learning this lesson from experience, one no longer allows oneself to be impressed by mere argumentation. This is when one properly begins to think. The error here is to suppose that aporia is the dead end of thinking, that it is unthinkable, when it is really its true beginning.

Aporia, contradiction, is not what stops thinking in its tracks; quite the contrary, as Hegel knew better than anybody else, contradiction is precisely what drives the movement of thought forward: it gives thought something to think through. If there is any value in learning debate, it is ultimately so that one can go beyond positing one-sided arguments and internalise this contradictory spirit within oneself. Thinking now learns to posit its own opposition in order to arrive at more nuanced insights, and in so doing, it transcends mere clever argumentation.

Participate in Malaysian Philosophy Society’s community for regular Thinkers Cafe discussions using a Socratic dialogue method. 

Critical Thinking Is An Activity

So, what is critical thinking? In the final analysis, thinking reveals itself to no longer be a skill but an activity. Why “activity” over “skill”? This question may seem to suggest that, in the end, our problem reduces to a mere semantic dispute—a question over the meaning of words that is itself meaningless. But the fact is, our choice of words betrays a lot about how we implicitly conceptualise things.

By calling thinking a “skill,” we ask others to conceive of it as essentially an instrument for determining sufficient means towards given ends or for constructing arguments in favour of or in opposition to given stances.

A skill is, after all, an acquired ability to perform a specific task proficiently. While this conception does contain some element of truth, nonetheless, it constitutes only a part of the whole picture.


As we showed earlier, abstracting this part and treating it as the whole leads to a conception of thinking that is indistinguishable from mere calculation and sophistry. What this reductive account fails to appreciate are the subversive and transformative potentials of thinking—and critical thinking especially—which can be accounted for only when we regard thinking as essentially an activity.

To be clear, activities and skills are not mutually exclusive; again, to repeat my formula: there are skills involved in an activity, but the activity itself is not a skill. In the case of thinking, what holds greater significance is not an individual’s ability to calculate or argue—this is not a question of competence—but much more their willingness to reflect upon the ends and stances they pursue.

But, most importantly, what counts is their openness to otherness. The purpose of engaging in the activity of thinking is not to satisfy individual ends, but instead to dialogue with other people, to encounter other ideas, and to experiment with other perspectives.

For, it is only through engagement with otherness that we can come to transform our personal worldviews, to subvert our natural positions, and to become critical thinkers by being critical of ourselves.

*Disclaimer: This article has been edited for clarity. The views and opinions expressed in this article are those of the author and do not necessarily reflect the stance of the Malaysian Philosophy Society.