I've been thinking a lot about the "AI makes us dumber" narrative that keeps popping up in my feeds. You know the one – the hand-wringing articles about how we're losing our ability to think, write, or solve problems because we're leaning too heavily on artificial intelligence.
Here's the thing: I think we're asking the wrong question entirely.
The Woodworking Problem
When you first start woodworking, you make a bookcase by nailing pieces of wood together. Maybe you throw in a couple of 45-degree angle braces if you're feeling fancy. You paint it, it works, and you're proud of what you built.
As you get more experienced, you learn about rabbet joints and dados. You realize that sliding boards into cut grooves and then fastening them creates something much stronger. Later, you discover that adding a kick plate underneath makes the whole structure more rigid. Eventually, you learn that using a card scraper instead of sandpaper gives you a much nicer finish.
Now imagine if you had decided early on that nailing boards together was the "right" way to build furniture. Even if you became the absolute best at nailing boards together – the fastest, most precise board-nailer in the world – that doesn't mean nailing was actually the best approach.
This is exactly what's happening with the AI debate. We're measuring AI's impact against our current ways of doing things, as if those methods were optimal to begin with. Maybe in many cases they were. But what if they weren't?
The False Baseline
The fundamental flaw in the "AI makes us dumber" argument is that it assumes we were doing things correctly before AI came along. It treats our existing approaches as the gold standard and measures any deviation as a decline.
But think about it: most of our current methods for writing, coding, research, and problem-solving evolved under specific constraints. We developed these skills when information was scarce, when collaboration was limited by physical proximity, and when the cost of iteration was high.
AI removes many of these constraints. When I watch people with no prior experience in a discipline pick up AI tools and discover entirely new approaches to solving problems, I don't see someone getting dumber. I see someone freed from the accumulated assumptions and habits that the rest of us carry around.
They don't have the hang-ups that come from building a career around doing things a certain way. They're not invested in defending methods they spent years mastering. They can just ask: "What's the best way to solve this problem right now?"
My Experience: Getting Better, Not Dumber
Despite what I just said about fresh perspectives, I'm actually doubling down on the tools and languages I know best. I'm using AI to get dramatically better at Python, Django, HTML, and the stack I've been working with for years.
I can now build in a few hours what would have taken me weeks before. I'm not just copying and pasting AI-generated code – I'm using AI as a sophisticated pair programming partner that helps me implement ideas faster and explore possibilities I might not have considered.
When I compare this to no-code and low-code tools like Lovable or Supabase, I consistently find that I can build better solutions faster using Cursor (my AI-enhanced code editor) with the technologies I already understand. The AI amplifies my existing knowledge rather than replacing it.
My brain feels more engaged, not less. The possibilities feel boundless. The constraint has shifted from "can I build this?" to "when will I build this?" and more importantly, "how do I stay human while building it?"
The Real Challenge: Staying Human
And that's where we get to the actual problem. It's not that AI makes us dumber – it's that AI might make us less human.
When you can suddenly be this productive, when you can build and create at this pace, it becomes incredibly easy to get sucked into a hyper-state of productivity that will absolutely rot your soul.
I can feel the pull. When I'm in flow with AI tools, building and iterating and solving problems at speeds that feel almost magical, hours disappear. The temptation is to keep going, to build just one more feature, to solve just one more problem.
But what happens to the things that actually feed your soul? The time in the woods, the conversations with your family, the moments of genuine rest and reflection? What happens to the parts of being human that have nothing to do with productivity?
Finding the Balance
I'm still figuring this out, but here's what I've learned so far. The solution isn't to use AI less – it's to be more intentional about everything else.
I've had to create absolute stop times. No work on weekends, period. An afternoon off in the middle of the week. Done by 6 PM every night, no matter what's happening with a project. The gym, no matter what. Even more time outdoors than before.
These aren't just nice-to-haves anymore. They're essential guardrails against losing connection with what keeps me whole and interested and excited about the work I do when I return to it.
The Questions We Should Be Asking
Instead of "Is AI making us dumber?" I think we should be asking:
How do we use AI to become better at the things we're already good at?
What new approaches become possible when we remove old constraints?
How do we maintain our humanity while dramatically increasing our capabilities?
What parts of being human do we need to protect and nurture more intentionally?
The woodworking analogy works here too. A master craftsperson doesn't just use better tools – they understand when to use each tool and, critically, when to step away from the workshop entirely.
Looking Forward
I'm not worried about AI making us dumber. I'm excited about what becomes possible when we combine human intuition, creativity, and judgment with AI's speed and capability.
But I am concerned about what we might lose if we don't think carefully about how to integrate these tools into lives that remain fully human. The challenge isn't technological – it's philosophical and practical. How do we amplify our capabilities without losing ourselves in the process?
The answer, I think, lies not in rejecting these tools but in being more intentional about everything else. In protecting the spaces and relationships and experiences that have nothing to do with productivity. In remembering that the goal isn't to build more or faster, but to build better lives.
Because at the end of the day, the best woodworker isn't the one with the most sophisticated tools – it's the one who knows what's worth building and why.