There’s a certain panic in the air right now that feels familiar.
Everywhere you scroll: "Don’t watch Netflix tonight. Watch this one-hour video and finally learn AI." As if AI were a language you could casually pick up between episodes.
You can’t.
AI isn’t something you learn. It’s a shift in how you think. Once it clicks, there’s no going back. You stop "using" it and start thinking with it, and suddenly doing anything without it feels unnecessarily manual.
We’ve seen this before. When desktop publishing arrived, there was a mild panic that real design was over. That taste would disappear. That anyone with software would suddenly think they were a creative director.
They were half right.
Technology doesn’t give you taste. It exposes whether you had any to begin with.
For me, AI clicked because of how my brain already works. I was a dancer, yes, but cognitively I’m a database. I like systems. I like logic. I like the clean satisfaction of "if this, then that." AI didn’t change that. It accelerated it.
And then I built an app.
GoShed just got approved in the Apple App Store, which sounds celebratory. It wasn’t. It was messy, slow, occasionally maddening, and had very little to do with "learning AI."
What it actually required: logic, reasoning, imagination layered on top, and a slightly unhinged level of patience.
One wrong comma, one stray quotation mark, and everything breaks. I also learned that programming languages have very strong feelings about quotation marks, which feels like a personality flaw.
The people getting real value out of AI right now aren’t the ones collecting prompts. They’re the ones doing the work. Asking better questions. Pushing past "that’s not possible." Staying in it long enough for something useful to happen.
Meanwhile, culturally, AI has entered its performance era.
This week alone: a plane to Beijing loaded with AI power players, sparking a guessing game about who made the list and who didn’t. At home, the red carpet has merged with the prompt box. Reese Witherspoon, Anne Hathaway, Demi Moore, and a growing roster of public figures are announcing they "can’t live without" AI.
Even Mel Robbins, whose brand is built on radical transparency, briefly forgot to mention she was being paid to promote Microsoft’s Copilot. Oops. (Of course, Tony Robbins got there first.)
It’s an infomercial: celebrities lining up to hold the product. I know the PR playbook: talking about AI signals relevance. It’s career insurance dressed up as enthusiasm.
And then there’s Martha.
Martha Stewart just announced a new AI company with $10 million behind it. Which sounds impressive until you remember: Martha doesn’t need $10 million.
What she actually brings is something far more valuable than capital: permission.
You can almost hear the boardroom logic. The product exists. The technology works. What it needs is a face people trust. Not just an influencer, but a translator. Someone who makes the unfamiliar feel domestic. Someone who signals that this belongs in your home.
Because if Martha likes it, maybe everyone else will too. That’s not trivial. That’s distribution.
But it’s also where things get uncomfortable.
Data centers. Energy. Scale. Those don’t resolve themselves.
Because while AI is being packaged, promoted, and polished for mass adoption, the harder questions are still sitting there. Not just environmental impact, though that’s part of it.
It’s bigger than that. Responsibility in how these systems are built. What they’re trained on. How they’re used. Who benefits. Who doesn’t. What gets automated away, and what disappears with it.
That part hasn’t had its glossy rollout yet.
It will. The same people lending their names and platforms now will, inevitably, be asked to stand behind something more substantive than access.
Or at least, they should be. Because AI isn’t something you learn. It’s a shift in how you think. And once you start thinking with it, you see both sides more clearly. The leverage, and the cost.
I felt that building GoShed. Not in some abstract, philosophical way, but in the actual process. Logic, reasoning, imagination, patience. One wrong character and everything breaks. You don’t get to fake your way through it. You have to engage.
That’s the part no one can package for you.
The real divide right now isn’t between people who "know AI" and people who don’t. It’s between people who are willing to sit inside the friction of it and people who are still looking for the shortcut.
There isn’t one.
Connie
P.S. I’m building something around the harder questions. It’s called The AI Trust Collective. If that’s interesting to you, give me a holler.
