
Abundance, scarcity, and the future of work in an AI world

  • Michael B. Horn
  • March 24, 2026

To understand what students should learn, it helps to have a view of what the future society will look like—and what it will value.

Using practical theories tested in the real world across different circumstances—as opposed to opinions or analyses based on data that does not yet exist—suggests there may be important twists ahead that should lead us to question whether a “world of abundance” could ever truly exist everywhere all the time.

Many in Silicon Valley who believe AI will lead to mass unemployment argue that if AI can perform most economically valuable tasks, demand for human labor will collapse, potentially producing a world of material abundance but little need for human work.

This world of abundance would, in turn, decimate our understanding of economics and render the traditional models and tools we use to manage the economy and work obsolete, as they are largely built around the notion of scarcity: how to allocate scarce resources among competing uses.

If you believe that a world of abundance will extend to everything, then there’s something to this line of thinking. But there’s still plenty to debate.

As my colleague and founder of Humanist Venture Studio, Allison Dulin Salisbury, noted recently in her interview with former presidential candidate and universal basic income advocate Andrew Yang, “little rigorous thinking exists on these topics.”

Part of the reason there’s little research is that the data don’t exist. By definition, data describe only the past, and reliable patterns emerge only well after the fact. Our AI future still awaits.

Because of the backward-looking nature of data, past predictions about how new technologies would affect jobs have also generally been wrong, and the impacts of past technologies can still be interpreted in very different ways. As a recent Anthropic paper, “Labor market impacts of AI: A new measure and early evidence,” by Maxim Massenkoff and Peter McCrory, noted:

“But the track record of past approaches gives reason for humility.

For example, a prominent attempt to measure job offshorability identified roughly a quarter of US jobs as vulnerable, but a decade on, most of those jobs maintained healthy employment growth. The government’s own occupational growth forecasts, while directionally correct, have added little predictive value beyond linear extrapolation of past trends. Even in hindsight, the impact of major economic disruptions on the labor market is often unclear. Studies on the employment effects of industrial robots reach opposing conclusions, and the scale of job losses attributed to the China trade shock continues to be debated.”

That means that we can look at patterns from past technological revolutions—but the insight they offer is limited. It’s also true that AI feels different from past technologies in many ways.

Here’s where good, circumstance-based theory—statements of causality that have been tested in different contexts and have clear boundaries—can play an important role in making predictions about the future and managing accordingly.

Although it certainly won’t be perfect, the reality is we’re all already using different theories when we prognosticate about the future—consciously or subconsciously.

For example, when people argue that past technology revolutions have destroyed jobs but also created many others, and contrast that with the claim that AI will result in mass abundance and mass unemployment, they are invoking a theory. Whether it’s the right theory for shedding light on this situation is worthy of debate.

There’s another set of theories, however, that has been well tested and can help answer pieces of this question. These theories focus on

  1. interdependence and modularity,
  2. commoditization and de-commoditization, and
  3. their corollary—the law of conservation of attractive profits.

The theory of commoditization and de-commoditization starts with a simple premise. From Clay Christensen and Michael Raynor’s The Innovator’s Solution:

“Whenever [commoditization] is at work somewhere in a value chain, a reciprocal process of de-commoditization is at work somewhere else in the value chain. And whereas commoditization destroys a company’s ability to capture profits by undermining differentiability, de-commoditization affords opportunities to create and capture potentially enormous wealth. The reciprocality of these processes means that the locus of the ability to differentiate shifts continuously in a value chain as new waves of disruption wash over an industry. As this happens, companies that position themselves at a spot in the value chain where performance is not yet good enough will capture the profit.”

What’s the process that transforms something that is proprietary into something that is a commodity?

It’s the natural process of overshooting performance that occurs as companies strive to keep ahead of their competitors—and the resulting shift to a modular architecture, vertical disintegration, and horizontal integration. Modularity and horizontal integration—with a relative lack of differentiation—lead to widespread scale and relative abundance.

But here’s the flip side. The law of conservation of attractive profits shows us that the big, proprietary value shifts to other parts of the value chain as this process occurs. From The Innovator’s Solution again:

“That’s because the process of commoditization initiates a reciprocal process of de-commoditization. Ironically, this de-commoditization—with the attendant ability to earn lots of money—occurs in places in the value chain where attractive profits were hard to attain in the past: in the formerly modular and undifferentiable processes, components, or subsystems.”

As commoditization sets in and prices collapse amid a sea of relative abundance, firms need to move upmarket to differentiate themselves. They do that by “finding the best performance-defining components and subsystems and incorporating them in their products faster than anyone else.” This means the suppliers of those components and subsystems are now on the “not-good-enough” side of the equation—and therefore need to reintegrate and create proprietary architectures that become performance-defining.

This then sets off another cycle of commoditization and de-commoditization.

This notion of commoditization and de-commoditization is analogous to abundance and scarcity.

When something that was previously scarce becomes abundant, adjacent steps in the value chain tend to become relatively scarce in ways we can’t easily imagine. This is sometimes described as value-chain inversion.

Another reason this occurs is that humans are insatiable, always seeking struggle and progress. Friction contributes to our sense of purpose. As the research on happiness and wealth shows, we don’t suddenly become content once we cross a certain threshold of material wealth.

What we’ve learned is that when one step in a value chain clamors for more and more of a particular good or service, the adjacent subsystem will initially be unable to keep up with that raw demand. And when that subsystem does become abundant, scarcity reappears elsewhere: something that once seemed plentiful will once again appear scarce.

What those who see abundance everywhere in everything suffer from, in other words, is a lack of imagination, which is normal, because we tend to see the world through a static, rather than dynamic, lens.

Is my analogy from commoditization and de-commoditization apt for abundance and scarcity? Consider oil. Early on, the fuel that powered cars was scarce. When it became abundant, roads appeared scarce. When road networks became abundant, urban space became scarce. The process has continued.

The same has been true in publishing. Newspapers were previously scarce. Thanks to steam printing presses in the 19th century, they soon became abundant, printed in large volumes. Reader attention and advertising space became scarce. Over time, the Internet created an explosion in advertising. Attention and trust became scarce, making curation and allocation valuable.

The same has been true for music, from something that once required being physically present to something available everywhere through recordings, portable music players, and streaming. Our time—and live performances—now feel scarce.

When electricity made power cheap and flexible, the scarce resource stopped being energy and became the ability to redesign factories and workflows to use it. Once assembly lines and optimized factory layouts spread, the bottleneck shifted again—to management systems capable of coordinating large-scale industrial production.

Technology rarely eliminates scarcity—it simply moves it elsewhere in the system.

This happens in ways most of us can scarcely imagine in advance. As my friend and former Southern New Hampshire University president, Paul LeBlanc, is fond of saying: If you had told the throngs of people camped out in Hoovervilles in the United States in the 1930s that just 20 years later, suburban houses and shelf-stable food would become abundant, they would have thought you were nuts. But this abundance meant that other things became relatively scarce.

When it comes to AI, we may already be seeing this abundance-scarcity dynamic at play. As a recent article in The Wall Street Journal headlined, “AI Isn’t Lightening Workloads. It’s Making Them More Intense,” suggests, as AI makes us more productive, people are using the freed-up time to do other work, not less work. From the article:

“‘It’s not that AI doesn’t create efficiency,’ said Gabriela Mauch, ActivTrak’s chief customer officer and head of its productivity lab. ‘It’s that the capacity it frees up immediately gets repurposed into doing other work, and that’s where the creep is likely to happen.’”

What this may suggest is that where there are opportunities for companies to grow, the work for humans may increase. People may spend far more time exercising judgment, weighing ethics, coordinating with other humans, and the like.

Indeed, while AI does many amazing things, the AI we use the most at the moment—built on large language models—is predominantly trained on one, arguably two, parts of the human experience: language and sight. Humans have many other senses for navigating the world—touch, smell, and taste—to say nothing of our subconscious mind, gut instincts, and more that we don’t fully understand.

Maybe it’s also true that AI will accelerate the demise of companies that are already shrinking or have a limited value proposition and runway, where the freed-up time means there is less work for humans to do because there are no opportunities to increase revenue and grow. Who knows.

But we can predict that when something becomes abundant, adjacent parts of the system will appear scarce in ways we can’t foresee. That will mean lots of turbulence, but also not endless abundance.

Perhaps the thing we have to fear most is our lack of imagination.

Author

  • Michael B. Horn

    Michael B. Horn is Co-Founder, Distinguished Fellow, and Chairman at the Christensen Institute.