
Critical thinking is a cornerstone of effective leadership, problem-solving, and innovation, but what exactly is happening in your brain during the process? At the most basic level, critical thinking is the ability to analyze, evaluate, and synthesize information to make reasoned decisions. It’s both deliberate and sustained, drawing on cognitive processes such as problem-solving, decision-making, and reflective thinking; in other words, the polar opposite of reacting impulsively or relying on gut instinct.
People approach critical thinking in many different ways based on their personal preferences, goals, or the nature of the problem they're trying to solve. Some swear by productivity expert Cal Newport’s “deep work” concept, which includes extended periods of distraction-free thinking. Similarly, Andy Tryba's marble method rewards your mind for completing 30-minute blocks of intense focus on a single task. Regardless of your method, the common thread is time. True critical thinking is a sustained and deliberate effort, and you could argue that this skill is more vital than ever in our quickly evolving and increasingly AI-powered world.
With AI permeating almost every layer of work, personal life, and education, and with its ability to deliver immediate feedback with a near-human semblance of intelligence, many people are beginning to over-rely on it. I have personally started to lean on it heavily for coding tasks, especially those that would once have required hours of deep work and that I now get cognitively “for free” from tools like Claude, GitHub Copilot, and Crowdbotics.
This got me thinking: Is AI making us “dumber”? Is our ability to think critically slipping away as we sit back and let AI do the dirty work for us? A growing body of research suggests that this cognitive offloading does impact critical thinking, which requires active cognitive engagement to analyze, evaluate, and synthesize information.
Cognitive offloading is simply the act of using an external action or tool to reduce the load on your working memory. This can be as simple as tilting your head to see an image better, or as elaborate as using a neural network of over 200 billion parameters to create a blueberry muffin recipe. We all do it. It’s human.
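To make that last example concrete, here is roughly what the offloading looks like in code: a minimal sketch using Anthropic’s Python SDK, where the model name is a placeholder (swap in whichever model you use) and the prompt is just the illustrative muffin request from above.

```python
# A minimal sketch of cognitive offloading to a large language model.
# Assumes ANTHROPIC_API_KEY is set; the model name below is a placeholder.
import anthropic

client = anthropic.Anthropic()

# The "thinking" (recalling ratios, baking times, technique) happens in the
# model's hundreds of billions of parameters instead of in our working memory.
message = client.messages.create(
    model="claude-sonnet-4-20250514",  # placeholder; any capable model works
    max_tokens=500,
    messages=[{"role": "user", "content": "Write a blueberry muffin recipe."}],
)
print(message.content[0].text)
```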
What we're losing in the pursuit of speed and instant answers
When we look at what AI might do (or is doing) to our brains, it's important to remember that this isn't the first time technology has reshaped nearly every aspect of human life. In 2011, a group of researchers investigated the impact of Google’s ability to provide instant access to information. They found that when people expect information to remain continuously available (as we do with internet access), they are more likely to remember where to find it than to remember the details of the item itself.
Like many of you, when I was young, information wasn’t instantly accessible. If you wanted to learn more about something, you’d go to the library, rummage through the card catalog, locate the section, find the aisle, search for the book, read the table of contents, flip to the chapter, and then read. If you didn’t want to do that, you’d simply not know the answer—which actually is kind of liberating to think about now, simply not knowing.
Today, the epic quests once required to find answers have long since abated. From Google to ChatGPT, we’re becoming natural cyborgs, symbiotic with our computer tools, and knowing the information now matters less than knowing where or how to find it.
But with this undeniable convenience and speed, what is the trade-off?
Is AI fundamentally changing thinking?
There may be a knee-jerk reaction to see AI’s influence on critical thinking as a serious warning or sign that we're outsourcing too much of our mental load. But maybe this is the wrong question. Would a decline in deep thinking be an unfortunate consequence, or is the shift inevitable? And is this necessarily a bad thing? Cognitive load theory says no(ish), while others say yes.
Every major technological leap, from the printing press to the internet, has been accompanied by a fear that it would dull human intellect. In fact, this line of thinking can be traced all the way back to Socrates, who worried that a reliance on writing would weaken memory and genuine understanding. Ultimately, however, each innovation paved the way for new forms of progress.
Let's dissect a real-world example from both sides. Today, AI is heavily used in data analytics, where models can analyze massive data sets to identify patterns, predict trends, and filter out noise. On one side of the coin, this is great: it offloads a lot of cognitive work, freeing humans to interpret correlations and make decisions. The other camp would say this reliance on AI erodes humans' ability to do in-depth, independent analysis.
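To ground that trade-off, here is a minimal, self-contained sketch of the kind of analysis being offloaded: one model flags the noise, another fits the trend, and the human is left only to interpret the output. The data, library choices, and parameters below are illustrative assumptions, not anything from a specific deployment.

```python
# Illustrative sketch: offloading pattern-finding in a sales series to models.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
days = np.arange(365).reshape(-1, 1)
sales = 100 + 0.5 * days.ravel() + rng.normal(0, 5, 365)  # upward trend + noise
sales[rng.choice(365, size=10, replace=False)] += 80      # a few anomalous spikes

# Offload step 1: a model filters out the noise by flagging anomalies.
keep = IsolationForest(contamination=0.03, random_state=0) \
           .fit_predict(sales.reshape(-1, 1)) == 1

# Offload step 2: a model identifies the underlying trend in the clean data.
trend = LinearRegression().fit(days[keep], sales[keep])

# The human's remaining job: interpret the result and decide what to do.
print(f"Estimated growth: {trend.coef_[0]:.2f} units/day")
```

Nothing here is beyond a patient human with a spreadsheet; the point is that the pattern-finding itself never has to pass through our own working memory.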
So is AI making us dumber?
The answer is maybe, with a hard lean toward yes. Cognitive offloading, while initially liberating mental resources, may ultimately diminish our intellectual capabilities. Some scholars warn of "cognitive laziness" developing as people become less inclined to engage in thorough analytical thinking. Delegating memory and decision-making functions to AI systems will gradually erode our ability to perform these mental tasks independently, potentially compromising cognitive adaptability and resilience.
Our extended dependence on artificial intelligence for cognitive support will weaken fundamental mental faculties, including memory, analysis, and problem-solving capabilities. This sustained and growing outsourcing of cognitive functions will lead to atrophy of internal mental processes, possibly degrading long-term memory function and overall cognitive well-being. Time will tell.
But is this a bad thing? AI may accelerate “cognitive laziness” by today's standards, but that doesn’t mean we’re worse off. Not long ago, simply using a calculator was deemed “lazy,” yet now you carry one with you everywhere you go (your phone). As with any new technology, humanity grows and transforms to make the best use of it. I think the difference with AI, however, is the speed at which it has impacted us.
The question you have to ask yourself is this: Is a decline in cognitive skills a price worth paying for unprecedented convenience and global progress? Or is it possible that AI isn’t diminishing our critical thinking but evolving it? It could be that the ability to leverage AI-generated insights, adapt to rapid changes, and discern truth from misinformation will become the new benchmark for human intellect.
Ultimately, it isn’t about whether we’re losing old forms of thinking as AI infiltrates so many aspects of life—it’s whether we’re ready to embrace the new ones.
The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.