The Shortcut That Skips Thinking
Everyone wants shortcuts. That’s nothing new. There have always been “get rich quick” schemes and homework services. However, the rise of generative AI (GenAI) has introduced a new kind of shortcut, one that skips the thinking altogether. We are seeing a growing trend of overreliance on AI tools like ChatGPT, Copilot, and Midjourney in everything from coding and writing to spiritual reflection. As someone who works in the AI space professionally and reflects on theology personally, I’m no stranger to these tools. The more I use them, however, the more I see the need for discernment in how we use them.
From TI-83 to ChatGPT: Garbage In, Garbage Out
There’s an old phrase I first heard in high school math class, when my teacher warned us about over-relying on the then-revolutionary TI-83 graphing calculator: garbage in, garbage out. It still rings true today, both in the Army and in the cloud. No matter how advanced the system, the quality of the output is only as good as the quality of the input, and the intentions behind it. That’s especially true with GenAI. These tools sound smart, but they aren’t thinking. They echo patterns. They remix the past. It’s like backtesting a trading strategy on historical data and then wondering why it falls apart in real time. If we’re not doing the deeper work of asking why, evaluating truth, and wrestling with context, we risk mistaking fluency for understanding.
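To make that backtesting analogy concrete, here’s a toy sketch in Python. It’s purely my illustration, not anything from the research below, and the rule, the numbers, and the data are all made up: it tunes a naive momentum rule on simulated historical returns, keeps whichever setting happened to score best, and then shows that the “winner” does nothing special on data it has never seen.

    import random

    random.seed(42)

    # Simulated daily returns: pure noise, so no rule should truly "work" out of sample.
    history = [random.gauss(0, 0.01) for _ in range(500)]
    future = [random.gauss(0, 0.01) for _ in range(500)]

    def backtest(returns, lookback):
        # Naive momentum rule: go long tomorrow if the last `lookback` days summed positive.
        profit = 0.0
        for i in range(lookback, len(returns) - 1):
            if sum(returns[i - lookback:i]) > 0:
                profit += returns[i + 1]
        return profit

    # Garbage in: keep whichever lookback happened to score best on the past...
    best = max(range(2, 60), key=lambda lb: backtest(history, lb))
    print("best lookback found: ", best)
    print("in-sample profit:    ", round(backtest(history, best), 4))
    # ...garbage out: the same rule does nothing special on data it has never seen.
    print("out-of-sample profit:", round(backtest(future, best), 4))

The point isn’t the trading rule. It’s that a pattern fitted to the past can look confident and still have no idea why it “worked,” which is exactly the trap of treating a fluent GenAI answer as understanding.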
What the Research Says About Overreliance on AI and Learning
That danger becomes even clearer in education. According to a 2025 systematic review published in the International Journal of Engineering Pedagogy, 71.4% of studies reported positive effects of generative AI on self-regulated learning (SRL), especially when tools like ChatGPT are used for personalized feedback, metacognitive prompts, and adaptive learning guidance. Students learned to take more control of their educational journeys, a major win in an era of constant digital distractions. The same review, however, also issued a warning: overreliance on AI can diminish the very independence it’s supposed to support. Some students began letting the tool think for them. Their ability to reflect critically and solve problems on their own took a hit [1].
The report also found that 62.5% of studies linked generative AI with improved critical thinking skills, thanks to AI’s ability to provide reflective feedback and present multiple perspectives. But again, the benefits hinged on how the tools were used. Passive engagement led to cognitive atrophy. Structured, intentional use, especially within the Hybrid Human-AI Regulation (HHAIR) framework, preserved learner autonomy and turned AI into a partner, not a crutch.
That nuance matters far beyond academia. The current director of my organization once said, “If you’re not using these tools to help in your coding, not only are you in the wrong place, but you’re simply wrong.” And he’s right, to a point. If you’re in tech and not at least experimenting with AI, you’re falling behind. But if your entire process becomes prompt, copy, paste, you’re not developing skills. You’re becoming dependent on a system that can’t explain itself, can’t reason, and doesn’t know why it does what it does.
Always Ask Why: A Framework for Discernment
One of the best pieces of advice I received came from the director of the executive education program I attended at Carnegie Mellon: “Do not be easily convinced. Always ask why.” That mindset has stuck with me. It’s foundational, not just for working in AI, but for navigating a world increasingly shaped by it. The more convenient something becomes, the more critical it is to question what’s beneath the surface. In my own work, that habit has kept me from running in circles. It matters to know when to turn to GenAI and when to do your own research. The chatbot doesn’t know the deeper requirements behind why you’re asking it a question, and if you don’t have a firm grasp on what you’re doing, you may find yourself reaching for a tool that doesn’t meet those requirements and having to refactor later. I’ve fallen into that trap myself, so I know it’s a big one.
Use the Tools: Don’t Lose Yourself
This isn’t a call to ditch AI. It’s a call to use it responsibly. To view it the way a craftsman views a chisel: useful, even beautiful in the right hands, but useless without vision. In a world racing toward automation, intentionality is the most human trait we can preserve.
So yes, use the tools. Explore their power. But never forget who’s holding the chisel. And never stop asking the hard questions, because that’s the difference between a child and a master.
If this resonated with you, or challenged you, I’d love to hear your thoughts. How are you using (or resisting) AI in your own work or studies? Drop a comment below or reach out through the blog. Let’s keep the conversation going.
[1] Sardi, J., Darmansyah, Candra, O., Yuliana, D. F., Habibullah, Yanto, D. T., & Eliza, F. (2025). How Generative AI Influences Students’ Self-Regulated Learning and Critical Thinking Skills? A Systematic Review. International Journal of Engineering Pedagogy, 15(1), 94-108. https://doi.org/10.3991/ijep.v15i1.533799