Bias Towards Action is the Most Valuable Skill in the AI Era
Why bias towards action has become the most valuable professional skill in the age of AI, and why it matters more than your technical background.
The company retreat last week was in Huntington Beach. Great weather, nice hotel, and a lot of conversations about AI. Some of it was about how we use AI internally to build software faster. Some of it was about AI features in the product itself. All of it was very forward-looking.
Meanwhile, I’ve been quietly shipping things.
Over the last few weeks, I’ve been delivering projects outside our normal sprint process: stuff that needed to happen but didn’t fit neatly into the usual planning cycle. And the pattern I kept noticing was that the people who got things done weren’t necessarily the most technical or the most experienced. They were the ones who just started.
The Old Calculus
There’s a reason we built cultures of deliberation. When implementing something was expensive (weeks of engineering time, complex deployments, careful coordination across teams), it made sense to think hard before committing. Get everyone aligned. Validate the requirements. Make sure you’re building the right thing before you build it, because the cost of being wrong was real.
So we built process around that. Sprints, planning sessions, stakeholder alignment meetings. Not because people love meetings, but because the overhead of consensus was lower than the cost of rework. The math worked out.
That math has changed.
The New Calculus
With AI assistance, the cost to produce a working prototype has collapsed. Something that used to take a week of careful engineering can now take an afternoon. The cost of being wrong is much lower, because you can just make the thing and see if it’s right.
This fundamentally shifts the optimal strategy. If you can show someone a working demo faster than you can finish explaining your proposal, why are you still explaining?
I’ve watched this play out repeatedly. I’d get a vague sense that something needed to exist. Instead of scheduling a meeting to align on requirements, I’d build a rough version. And when I showed up with working code, the conversation changed completely. People stopped debating whether to build it and started talking about what to do next.
Because here’s the thing: they just wanted it done. They didn’t really care about the implementation discussion. They wanted the thing.
Turns out, working code is a better requirements document than any spec.
AI Fills the Gaps
Here’s the other thing I’ve noticed: AI doesn’t just make implementation faster, it fills in skill gaps. You don’t need to know every library or tool. You don’t need to be an expert in whatever technology the project calls for. You can start building in an area where you have moderate competence and let AI handle the parts where your knowledge runs thin.
This means the limiting factor isn’t technical skill anymore, at least not in the way it used to be. The people who are succeeding right now tend to be the ones who just decide to go. Who treat “I’m not sure how to do this” as a starting point instead of a stopping point. Their original discipline almost doesn’t matter since AI smooths out enough of the gaps that the thing still gets built.
Bias towards action used to mean being willing to take on risk. Now it also means being willing to use tools that dramatically lower that risk, and choosing to act anyway.
Back to Huntington Beach
To be clear, the conversations at the retreat were interesting. The honest reality is that nobody has a clean strategy for AI right now. It’s genuinely hard to know what’s coming or what the right choices are. There are a lot of smart people with a lot of different opinions, and the landscape keeps shifting.
But that uncertainty is actually the best argument for just starting. When you don’t know what the right path is, making forward progress is how you find out. You ship something, learn from it, adjust, and ship again. The organizations and individuals who are iterating are going to get there faster than the ones who are waiting until things are clearer (they won’t be).
Let me know if you’ve seen the same pattern. I’m curious whether this maps to other people’s experience or if I’m just in a particular corner of the industry where this is playing out.