The Best Companies Won’t Just Use AI. They’ll Train It on Culture.

The Wrong AI Debate

Most companies are having the wrong AI conversation.

They’re focused on the obvious questions: which model, which vendor, which copilot, which governance framework, and whether to build or buy. Those are real decisions, but they’re not the deepest ones. The deeper question is what these systems are actually learning from. Over the next few years, the companies that pull away will not just be the ones with the most AI in their stack. They’ll be the ones with the best inputs, the best context, and the clearest view of how people are changing before that change fully shows up in the numbers.

That’s where culture comes in, and most companies still underuse it.

Culture is still treated like a downstream function. It gets pulled in for campaign planning, brand positioning, or the occasional trends report that gets shared around for a week and then disappears. That framing is too narrow for the moment we’re in. If AI is becoming a decision layer inside the enterprise, then culture needs to become part of the intelligence layer feeding those decisions. Not because it sounds progressive, but because it captures shifts in behavior, values, language, and attention earlier than most formal systems do.

AI Adoption Is Real. Differentiation Is Still Up for Grabs.

This matters more now because AI is rapidly becoming table stakes. McKinsey’s latest State of AI research found that 88% of organizations now use AI in at least one business function, and 79% report using generative AI somewhere in the business. That sounds like a massive advantage until you look closer. Adoption is broad, but scaled value is not. A smaller group is actually translating AI into meaningful operating or financial impact, which tells you something important: access to the technology is no longer the moat. What matters is how well the technology is grounded in reality.

Menlo Ventures put numbers on how fast this is moving. They estimate enterprise generative AI spending reached $37 billion in 2025, up from $11.5 billion in 2024 and just $1.7 billion in 2023. When a category scales that quickly, differentiation based on features alone doesn't last very long. Capabilities spread. Interfaces get copied. Pricing compresses. The advantage shifts toward systems that are more integrated, more trusted, and better informed.

That’s why I keep coming back to a simple point: most companies are rich in data and poor in signal.

Most Companies Know What Happened. Fewer Know What’s Forming.

There is no shortage of reporting inside large organizations. Teams can tell you what sold, what underperformed, what the campaign did, where conversion moved, how the quarter closed, and which segments responded. Those systems are useful, but they are mostly built to describe what has already happened. They are not consistently built to detect what is forming. And if you care about where a market is going, that distinction matters a lot.

Culture is where people test the future in public. New language usually shows up there first. New definitions of value show up there first. New consumer identities, emerging norms, status signals, anxieties, rituals, and aspirations all tend to become visible in culture before they become obvious in formal research. By the time something is fully legible in a dashboard, it has usually been building for a while somewhere else.

That gap between signal formation and institutional recognition is getting more expensive because attention itself is getting more fragmented.

Attention Is Fragmented, So Signal Formation Is Too

Deloitte found that U.S. consumers now spend about six hours a day with media and entertainment, but that time is split across an increasingly fractured set of environments: streaming, social platforms, creator ecosystems, music, gaming, podcasts, short-form video, and everything happening in between. The total amount of attention is not the real story. The distribution of it is. When attention fragments, signal formation fragments too. The next important shift in your category is less likely to arrive as one big obvious trend and more likely to build across smaller pockets of culture before it hardens into a broader market behavior.

That’s one reason the old foresight model is starting to feel inadequate. It was built for a world where change moved more slowly, where trends were easier to name, and where organizations had more time to wait for validation. Today, that delay is costly. By the time a pattern feels clean enough to package into a report, a meaningful part of the opportunity has often already been priced in by faster operators.

Companies Want More Innovation, But Their Discovery Systems Still Lag

This becomes even more important when you look at how aggressively companies say they want to innovate. Deloitte’s 2025 consumer products outlook found that 95% of executives said launching new products or services was a priority, 80% planned to increase investment in product innovation, and nearly two-thirds said they were shifting more of that investment toward truly novel products rather than minor extensions. That all sounds ambitious and forward-looking. But if the discovery system feeding those decisions still relies mostly on lagging indicators, then speed just becomes a more efficient way to act on stale assumptions.

McKinsey’s work on consumer goods innovation makes that problem even more concrete. They noted that average first-year sales for new CPG pacesetters fell by 50% between 2012 and 2018, and in packaged food, only 25% of new brands and disruptive innovations launched by large CPG companies in 2013 were still around four years later. You can read those numbers as an innovation failure, but I think they also point to a discovery failure. A lot of organizations want to build what’s next while relying on systems designed to validate what’s already familiar.

The Enterprise Is Getting Smarter. That Doesn’t Mean It’s Getting Wiser.

This is why I think the future of foresight is bigger than foresight itself. We’re moving into an environment where enterprise workflows will increasingly be shaped by agents, automation layers, internal copilots, and external data streams stitched together across the business. Gartner predicts that by 2028, 33% of enterprise software applications will include agentic AI, and 15% of day-to-day work decisions will be made autonomously. That is a significant shift in how organizations will sense, decide, and act.

But this is also where a lot of companies are going to get it wrong. Reuters reported on Gartner’s expectation that more than 40% of agentic AI projects will be canceled by the end of 2027 because of high cost, unclear value, or inadequate controls. That feels right to me. A lot of teams are rushing to automate decisions before they’ve done the harder work of improving the context behind those decisions. Faster systems do not automatically produce better judgment. In many cases, they just scale weak assumptions more efficiently.

Why Culture Belongs Upstream

That is why culture belongs upstream.

Not because it makes a strategy deck feel more current, and not because every company suddenly needs a trends function. It belongs upstream because it is one of the best ways to detect change before the market fully codifies it. When cultural intelligence is used well, it sharpens how companies understand emerging behavior, changing value systems, identity shifts, and the communities that often shape demand before the mainstream catches on. It gives organizations a better chance of seeing not just what people are buying, but what is becoming true about the people doing the buying.

That’s the role culture needs to play inside modern organizations. It should not sit off to the side as commentary. It should function as a discovery layer that informs product thinking, innovation priorities, positioning, content systems, and strategic planning. If companies are building AI-powered workflows that ingest internal data, vendor data, and operational data, they also need a structured way to ingest external human context. Otherwise, they end up with very sophisticated systems that are still reading the market late.

What Smart Companies Will Actually Do

The companies that get this right will ask better questions than “how do we deploy more AI?” They’ll ask what kinds of human signals they are systematically learning from, how early those signals become visible, how they separate noise from directional change, and how that learning actually shapes decisions. That is a much harder problem than buying software, but it is also a much more durable one.

The real opportunity is not just to automate work. It is to build organizations that can detect meaningful human change earlier and respond to it with more confidence. In that world, culture is not a nice-to-have. It is part of the sensing infrastructure.

That’s why the best companies won’t just use AI. They’ll train it on culture.

Because in a market moving this fast, being data-rich and signal-poor is one of the easiest ways to look sophisticated while missing the shift.