One Drink at the AI Party
When you show up at the AI Party, what are you there for? Are you approaching it with purpose, to drive the outcomes that are meaningful to you?
I recently came across a post by Lara Aigmüller titled Why I am leaving the AI party after one drink. Her experience mirrored mine closely enough that I felt compelled to respond.
Lara spent two weeks with Claude Code, building a personal app project she'd been carrying around for six years. She was impressed in places. The repetitive scaffolding work (color palettes, auth forms, layout components) the AI handled well. She also noticed the rough edges: generated CSS that worked but wasn't elegant, tech stack recommendations that didn't match her own experience, and a subtle pushiness when she tried to go a different direction.
Then she quit. And her reasons were not technical.
She left because of how it made her feel. There was something addictive about it. The effortlessness. The itch for the next prompt. And alongside that, a growing sense that the project would never truly feel like her own. She wanted to learn by making her own mistakes. She wanted to slow down. She didn't want to contribute to accelerating environmental harm.
Those are legitimate values. They're also personal and philosophical, not technical. That distinction matters.
Here's where my experience went a different direction.
I've spent twenty years in cybersecurity and infrastructure. I lead a team delivering complex cloud and compliance solutions to enterprise clients. When I started using AI tools, I had the same flicker of unease. Was I cheating? Was I getting lazy?
What I found was something closer to what I described in The Shrinking Team That Ships More: a junior developer that never sleeps, never complains, and needs constant supervision. The inelegant CSS Lara spotted? You catch that because you know CSS. The questionable tech stack recommendation? You override it because you've shipped production systems and know which scars you're willing to carry.
AI tools amplify the user. In Lara's hands (a developer who values craft and individual ownership) AI felt like it was undermining the things she cared most about. That's a coherent response. I'm not arguing with it.
But my obligation isn't to my own learning journey. It's to my clients' outcomes.
There's also something worth examining in the idea that AI is doing "the work." The work I do doesn't transfer to a prompt:
- understanding a client's regulatory environment
- knowing which architectural tradeoff creates a compliance liability
- designing controls that survive an audit
AI handles the surface area: the boilerplate, the first draft, the scaffolding. My team and I handle the rest. That's not cheating. That's delegation.
Lara ends with a hope to be part of a community that values craftsmanship and honest, high-quality work. I think she'll find it. In my earlier post on why we value handmade things, I argued that overcoming the burden of execution is what acts as a value multiplier on the idea itself. When AI removes that burden for everyone, the people who choose to carry it anyway are saying something. Their work carries a signal that mass-produced software can't replicate.
"No AI Used" will become a meaningful label. Not for the Fortune 500. Enterprise clients aren't buying software because a human agonized over each line. But individual humans often do care about exactly that. There's a real market for what Lara is describing. Think less AWS, more Etsy.
She knows what she came for. That's worth respecting. So when you show up at the AI Party, what are you there for?
🤖 AIL LEVELS: This content's AI Influence Levels are AIL2 for the writing and AIL4 for the images, per the AI Influence Level (AIL) framework.