
Will AI make us dumber or more efficient?

  • May 1
  • 4 min read

Updated: May 9

Written by Dr Gideon Fadiran, Dr David Fadiran


Artificial Intelligence (AI) has quickly gone from a futuristic concept to an everyday tool, powering our searches, drafting our emails, curating our content, and even making decisions for us. But as it becomes more integrated into how we work, think, and communicate, a deeper question emerges:


Is AI making us smarter, dumber, or just more efficient?

The short answer? It’s complicated, because it isn’t a binary choice. AI is undoubtedly making us more efficient, but in doing so, it may also be reshaping how we acquire and retain knowledge, possibly to our detriment.


Efficiency Is Undeniable

Let’s start with what’s clear: AI is helping people get things done faster. From writing code to summarizing research to crafting professional emails or learning new skills, AI drastically cuts down the time and effort it used to take to complete tasks.


In the past, if you wanted to understand an unfamiliar topic, say climate policy or neural networks, you would have to read articles, review papers, perhaps watch a few long-form lectures, and piece together an understanding. Now, you can type a single well-crafted prompt and get a clear, concise explanation, or even a publishable article, without ever truly grasping the depth of the subject. That’s efficiency. But it raises a critical concern.


The Risk of Shallow Understanding

This new form of productivity doesn’t necessarily equate to understanding. It’s entirely possible to produce high-quality content with only a surface-level grasp of the subject matter. In fact, it’s becoming more common.


Imagine being tasked with writing an article or delivering a presentation on a topic you’ve never studied. With AI, you can now generate a compelling, informative-sounding piece, despite having barely scratched the surface of the material yourself. In a professional setting, this may help you hit your deadline. But over time, it chips away at the intellectual rigor that used to accompany such tasks.


What we are seeing is the rise of a culture of “doing” without necessarily “knowing.” And while that boosts short-term output, it might reduce our long-term intellectual capital.


Learning: Faster, But Possibly Shallower

To be fair, AI has genuinely accelerated learning for those who are keen to learn. It can consolidate and present relevant information far more quickly than traditional methods, and motivated learners can absorb and synthesize knowledge at a faster pace than ever before.


However, for those more focused on simply completing tasks or reaching objectives, AI offers shortcuts. In the past, delivering results often required deep engagement and “learning by doing.” Today, that learning component is often bypassed. Tasks can be executed with surface-level input and minimal subject mastery.


This may lead to a gradual reduction in foundational knowledge across fields, as people become more task-focused and less concerned with understanding the "why" behind the work.


When AI Becomes the Source and the Student

Another concern arises when AI is not only delivering information but also learning from itself. If human users stop referring to original texts, academic literature, or primary data, and instead rely solely on AI summaries, then AI begins feeding off its own interpretations. This creates a feedback loop, where flawed or shallow outputs become the new training data.


Now, if misinformation or false assumptions are introduced into this loop, and millions of users begin consuming and acting on that content without verifying sources, the spread of false knowledge becomes faster and more efficient than ever.


“If there’s misinformation that’s been fed into AI, and every other person is learning based on that misinformation, then the process of learning the wrong information is much faster because people are not necessarily referring to the original texts that inform the subject.”


This makes the challenge even more urgent. What happens when the dominant source of knowledge no longer has an external benchmark? When AI references itself, who checks the facts?



The Death of “Learning by Doing”?

Traditionally, people gained knowledge through the process of trying, failing, iterating, and finally succeeding. Whether it was building a website, writing a report, or analyzing a dataset, the journey was the teacher.


Now, many of these tasks can be automated or semi-automated. AI can write the code. AI can do the analysis. AI can edit the text. The human becomes a project manager, not a practitioner.


As a result, people are reaching their goals without going through the full learning curve. This may save time, but it also flattens the experience and limits the development of deep expertise.




A Shift in Intellectual Growth?

All of this leads to a sobering thought: while AI is increasing short-term efficiency, it may be slowing down long-term intellectual growth.


As reliance on AI deepens, the rate at which people truly become knowledgeable might decline. Over time, we could see a growing gap between people who develop genuine expertise and those who merely know how to use AI to simulate it. The concern isn’t that people will become “dumber” in a literal sense, but that our collective depth of understanding may start to erode.


And in a world that increasingly rewards fast answers over thoughtful reflection, that erosion could have profound consequences.


Final Thoughts: Rethinking the Question

So perhaps we need to reframe the original question.

It’s not about choosing between being “dumber” or “more efficient.” Clearly, AI is making us more efficient. The more important question is:


Are we trading depth for speed, and if so, at what cost?

AI is a tool. It can accelerate both learning and ignorance. It’s up to us to decide which path we want to take.



 
