Purpose Beyond Productivity

There's something quietly ironic happening in the world of AI right now, and I think about it a lot.

The pitch has always been: AI is going to do more for you, so you can do less. It's going to handle the repetitive stuff, reduce friction, solve problems faster, and give you your time back. That's the promise. And in many ways, it's a real one.

But look around. In places like Silicon Valley, or really anywhere people are close to this technology, what do you actually see? You see people running harder. Longer hours, more urgency, more output. An arms race in which the tool that was supposed to give us our time back has become the fuel for going even faster. Everyone wants to be first. Everyone wants to be ahead. The technology that was supposed to liberate us has, at least for now, become the latest justification for acceleration.

And the data backs this up. Workday's 2026 research found that 37–40% of the time supposedly saved by AI gets immediately eaten by reviewing, correcting, and verifying AI-generated output. You save an hour drafting something; you spend 25 minutes fact-checking it because you can't fully trust it. There's even a term emerging for this phenomenon: "workslop," low-quality AI output that creates more cleanup work than it saves. Meanwhile, 89% of managers reported no measurable change in productivity despite rising AI adoption across their organizations. The machines are busy. The needle isn't moving.

I'm not saying that's entirely wrong, or that the technology isn't genuinely exciting. I'm in it too, and I'd be lying if I said I wasn't following the trend. But it does seem worth pausing on. If the dominant narrative around AI is productivity, output, growth, efficiency, the stock market always going up, then we're using an extraordinary technology to do more of the same. And somewhere in there, the human is getting a little more squeezed.

The identity problem nobody's talking about

Here's where it gets more interesting, and more personal for me.

For much of human history, people found meaning and identity in two primary places: community and faith. Church, or the equivalent in other traditions, gave people not just a belief system, but a sense of belonging, ritual, and purpose that extended far beyond daily work. Then, gradually, for a lot of people in the Western world, that faded. And something else took its place.

Work.

As Ram Dass would put it, our jobs became our somebody-ness. The question "what do you do?" became a proxy for "who are you?" Your title, your company, your output — the scaffolding of identity. And for a lot of high achievers, that scaffolding is load-bearing. Remove it, and you've got a real problem.

Now enter AI. Not just as a tool, but as something genuinely capable of performing significant portions of what many people do for a living. Economists and researchers have started using a term for what's emerging: the AI precariat — a growing class of workers in a state of precarious, AI-disrupted employment. The numbers are stark: an estimated 980 million jobs worldwide face high disruption risk, and 41% of employers intend to reduce their workforce by 2030. But beyond the economic displacement, what's consistently underreported is the identity displacement. The loss of meaning and purpose that work provides is, as the World Economic Forum put it, "a significant but often overlooked consequence."

I'm not talking about the headline fear of robots taking jobs. I'm talking about something more universal and more subtle: what is my purpose, if not my productivity?

The organizational blind spot

From an organizational standpoint, the conversation around AI is almost entirely focused on efficiency. And I understand why — the business case is easy to make. Bring in a consulting firm, map out all the tasks and activities across the organization, run the analysis, identify what can be automated, model the headcount reduction, present the slide.

But what that analysis almost never includes is the human side of the equation. Not in the performance sense — in the meaning sense.

When you automate a significant portion of someone's role, or eliminate it entirely, you're not just changing a job description. You're disrupting someone's sense of contribution, their daily structure, their professional identity, and in many cases their financial security. That's not a rounding error. That's a profound transition — and most organizations are completely unprepared to support people through it.

MIT Sloan Management Review put it plainly in a recent piece: "The gap between what HR currently is and what it could be has never been wider." There are two paths. One leads to a weakened HR function — more automated, more reactive, more focused on headcount efficiency. The other elevates HR into something more important: the architect of human transformation inside the organization. The uncomfortable truth is that most HR teams are currently on the first path without fully realizing it.

I'd really encourage people in HR and operations leadership to sit with this. Not just: how do we retrain people for new roles? But: how do we help our people make sense of this shift? How do we support the identity transition, not just the skills transition? These are harder questions, and they don't fit neatly into a workforce planning model. But they're the ones that will determine whether an organization comes out of this period with its culture and people intact — or fractured.

What fills the void?

Zoom out further, and the question becomes even bigger. And the market has already noticed.

Between 2022 and mid-2025, AI companion apps surged by 700%. Apps designed to alleviate loneliness, provide emotional support, simulate human connection. Research suggests they work — to a degree roughly comparable with real human interaction. Which, if you think about it, is both impressive and deeply concerning. Because what it tells you is that there's a loneliness gap already opening up — and we're rushing to fill it with more AI.

That, I think, is the wrong direction.

We are, at our core, herd animals. Our DNA was written in a world of tribes, fires, shared meals, and face-to-face conversation. We are wired for community. And one of the quietly underappreciated effects of the last few decades of technology — screens, social media, remote work, digital everything — is that we've drifted from that. More connected and more isolated at the same time.

AI could accelerate that drift. Or it could reverse it.

This is where I think the real opportunity lies — and it's one that the most forward-thinking organizations are starting to recognize. The shift isn't just from AI to more AI. It's from AI to EI — emotional intelligence. As machines absorb more of the execution layer, what remains distinctly and irreducibly human becomes more valuable, not less: judgment, empathy, creativity, trust, the ability to sit with someone and understand what they actually need. These aren't things AI can replicate. And they also happen to be the things that make life feel meaningful.

If we're willing to look up from our screens long enough to notice, there's an invitation here. To come back to something older and more essential — community, conversation, presence, nature, each other. Not as a rejection of technology, but as its complement. If AI does what it promises and genuinely absorbs much of the execution layer of work, then what opens up is something we haven't had in a while: space. Space to ask what we actually want to do with our time, our energy, our lives.

That, to me, is the most exciting possibility of all. And it has nothing to do with productivity.
